Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/01/14 16:16:42 UTC

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #585

See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/585/display/redirect?page=changes>

Changes:

[blais] python sdk examples: Fixed typo in wordcount example.

[noreply] [BEAM-13480] Increase pipeline timeout for

[noreply] Stronger typing inference for CoGBK. (#16465)

[noreply] [BEAM-12464] Change ProtoSchemaTranslator beam schema creation to match

[noreply] Introduce the notion of a JoinIndex for fewer shuffles. (#16101)

[noreply] Merge pull request #16467 from [BEAM-12164]: SpannerIO

[noreply] Merge pull request #16385 from [BEAM-13535] [Playground] add cancel

[noreply] Merge pull request #16485 from [BEAM-13486] [Playground] For unit tests

[heejong] [BEAM-13455] Remove duplicated artifacts when using multiple

[noreply] [BEAM-12572] Run java examples on multiple runners (#16450)


------------------------------------------
[...truncated 55.41 KB...]
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644172 sha256=8f26bb3170eb3b90b51648636e3f9ad8b11d067edcf37e316065bc4b7ec2abf6
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.35 botocore-1.23.35 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.16.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.4.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.29 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642175438.649805/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
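The "Discarding unparseable args" warning above is Beam's PipelineOptions reporting flags that no registered options class claims; a flag like --pubsub_namespace_prefix only parses cleanly once some PipelineOptions subclass declares it. A minimal sketch of the mechanism (the class name here is illustrative, not necessarily what the perf test uses):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # illustrative name
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    # Parses through the subclass view; without such a declaration the flag
    # shows up in the "Discarding unparseable args" warning instead.
    print(opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix)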
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220114155038650789-2982'
 createTime: '2022-01-14T15:50:45.150161Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-14_07_50_44-11596834051163246981'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0114150528'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-14T15:50:45.150161Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-14_07_50_44-11596834051163246981]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-14_07_50_44-11596834051163246981
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-14_07_50_44-11596834051163246981?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-14_07_50_44-11596834051163246981 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:52.188Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.725Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.778Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.822Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.859Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.891Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.920Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.964Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:53.997Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.025Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.048Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.082Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.138Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.198Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.287Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.318Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.342Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.374Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.429Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.457Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:50:54.487Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:51:21.304Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
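The message above is about Cloud Monitoring's per-project limit on Dataflow-created custom metric descriptors, and it suggests deleting old descriptors as the remedy. A hedged cleanup sketch using google-cloud-monitoring (the project ID comes from this log; the filter and the decision of which descriptors are safe to delete are assumptions):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = "projects/apache-beam-testing"  # project from this log

    # List only custom descriptors, the kind Dataflow creates for user metrics.
    descriptors = client.list_metric_descriptors(request={
        "name": project,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for d in descriptors:
        print("would delete", d.type)  # inspect before actually deleting
        # client.delete_metric_descriptor(name=d.name)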
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:51:42.920Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:52:08.712Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T15:52:08.736Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-14_07_50_44-11596834051163246981 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c07dde7cfbc54ee0be00bc15e9e14e86 and timestamp: 1642176207.338159:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 96
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0114150528.1642176211.777415/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220114160331778321-8520'
 createTime: '2022-01-14T16:03:38.923662Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-14_08_03_38-2118236556471501155'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0114150528'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-14T16:03:38.923662Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-14_08_03_38-2118236556471501155]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-14_08_03_38-2118236556471501155
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-14_08_03_38-2118236556471501155?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-14_08_03_38-2118236556471501155 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:45.906Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.077Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.107Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.162Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.250Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.284Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.342Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.432Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.471Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.498Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.531Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.555Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.580Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.607Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.630Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.653Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.686Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.740Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.795Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.829Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.851Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.884Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.911Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.944Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:47.979Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:48.033Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:48.058Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:03:48.092Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:04:10.061Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:04:33.327Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:04:57.666Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-14T16:04:57.696Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-14_08_03_38-2118236556471501155 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 61c56e7c010c4523a7409e1a16fd5d5b and timestamp: 1642176999.0555463:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 93
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 61c56e7c010c4523a7409e1a16fd5d5b and timestamp: 1642176999.0555463:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 93
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-14_07_50_44-11596834051163246981?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-14_08_03_38-2118236556471501155?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_e39edcda-9119-4299-a881-71820e7badf2_read'
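This TypeError is consistent with the google-cloud-pubsub 2.x API surface (the log above shows google-cloud-pubsub-2.9.0 installed): GAPIC methods no longer accept the resource path as a positional argument, so the bare string is treated as a DeleteSubscriptionRequest mapping. A minimal sketch, reusing the subscription path from the traceback; the keyword form is one 2.x-compatible spelling:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = ("projects/apache-beam-testing/subscriptions/"
                "pubsub_io_performance_e39edcda-9119-4299-a881-71820e7badf2_read")

    # Fails on google-cloud-pubsub >= 2.0 with the TypeError above: the
    # positional string is interpreted as the DeleteSubscriptionRequest mapping.
    # sub_client.delete_subscription(sub_name)

    # 2.x style: pass the field as a keyword argument
    # (or as request={"subscription": sub_name}).
    sub_client.delete_subscription(subscription=sub_name)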

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 18s
92 actionable tasks: 58 executed, 32 from cache, 2 up-to-date

Publishing build scan...
https://scans.gradle.com/s/4si2vexjx4tte

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #837

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/837/display/redirect?page=changes>

Changes:

[noreply] lint fixes to go (#23351)

[noreply] Bump cloud.google.com/go/bigquery from 1.41.0 to 1.42.0 in /sdks


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2f1f1a76419a032ea2d70671e3c6c9fe86b0626f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2f1f1a76419a032ea2d70671e3c6c9fe86b0626f # timeout=10
Commit message: "Bump cloud.google.com/go/bigquery from 1.41.0 to 1.42.0 in /sdks (#23329)"
 > git rev-list --no-walk 90739533a8c84a9354197cac435c85b4ba002344 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8932832964821614262.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0924150420 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
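The "2GB" in the job name follows from the --input_options JSON in the command above: 2097152 records with 1-byte keys and 1024-byte values is almost exactly 2 GiB of synthetic payload. A quick check, assuming total bytes ≈ num_records × (key_size + value_size):

    num_records, key_size, value_size = 2097152, 1, 1024
    total = num_records * (key_size + value_size)
    print(total, total / 2**30)  # 2149580800 bytes, ~2.00 GiB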
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/irzuvmwmzwlmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #836

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/836/display/redirect?page=changes>

Changes:

[Steve Niemitz] use avro DataFileReader to read avro container files

[noreply] Change google_cloud_bigdataoss_version to 2.2.8. (#23300)

[Moritz Mack] Fix Nexmark default log level

[noreply] Bump cloud.google.com/go/storage from 1.26.0 to 1.27.0 in /sdks (#23336)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 90739533a8c84a9354197cac435c85b4ba002344 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 90739533a8c84a9354197cac435c85b4ba002344 # timeout=10
Commit message: "Bump cloud.google.com/go/storage from 1.26.0 to 1.27.0 in /sdks (#23336)"
 > git rev-list --no-walk 762edd7f3a64f076dbee156fa48b8a7e5e6a512f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5669222698800813259.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0923150424 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/k3e62fz4konsu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #835

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/835/display/redirect?page=changes>

Changes:

[bvolpato] Do not use .get() on ValueProvider during pipeline creation

[noreply] [Java SDK core] emit watermark from PeriodicSequence (#23301) (#23302)

[noreply] Extend protocol in windmill.proto used by google-cloud-dataflow-java

[noreply] Allow longer Class-Path entries (#23269)

[noreply] Improved pipeline translation in SparkStructuredStreamingRunner (#22446)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 762edd7f3a64f076dbee156fa48b8a7e5e6a512f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 762edd7f3a64f076dbee156fa48b8a7e5e6a512f # timeout=10
Commit message: "Improved pipeline translation in SparkStructuredStreamingRunner (#22446)"
 > git rev-list --no-walk d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8719143650097081785.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0922150414 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fo5ixy5indbk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #834

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/834/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d578e3df7c963e57f251fb27739fbc1d3811e722 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
Commit message: "[BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)"
 > git rev-list --no-walk d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1854098947061856879.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0921150420 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5embazaj62ufw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #833

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/833/display/redirect?page=changes>

Changes:

[Pablo Estrada] Revert "Trying out property-based tests for Beam python coders (#22233)"

[noreply] Bump google.golang.org/api from 0.95.0 to 0.96.0 in /sdks (#23246)

[noreply] [Go SDK] Add timer coder support (#23222)

[noreply] Fix wrong comment (#23272)

[noreply] [Playground] [Backend] Cache component for playground examples (#22869)

[noreply] [BEAM-13416] Introduce Schema provider for AWS models and deprecate low

[noreply] [BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d578e3df7c963e57f251fb27739fbc1d3811e722 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d578e3df7c963e57f251fb27739fbc1d3811e722 # timeout=10
Commit message: "[BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)"
 > git rev-list --no-walk 5520fe064fc3b7196998d4597746119691eb6681 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6531923545032597377.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0920150434 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q3aadpbdhth4c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #832

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/832/display/redirect?page=changes>

Changes:

[noreply] Enable verbose output for RAT Precommit (#23279)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5520fe064fc3b7196998d4597746119691eb6681 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5520fe064fc3b7196998d4597746119691eb6681 # timeout=10
Commit message: "Enable verbose output for RAT Precommit (#23279)"
 > git rev-list --no-walk f477b85f230ebb5dbd6b62540da078a33e3318ce # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3850974803811121245.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0919150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lu7prop5mzhc2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #831

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/831/display/redirect?page=changes>

Changes:

[noreply] Add drop_example flag to the RunInference and Model Handler (#23266)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f477b85f230ebb5dbd6b62540da078a33e3318ce (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f477b85f230ebb5dbd6b62540da078a33e3318ce # timeout=10
Commit message: "Add drop_example flag to the RunInference and Model Handler (#23266)"
 > git rev-list --no-walk 8754cc0904872d37edbb8b4d3b8d9f92aad94acc # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7367883017291103710.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0918150402 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/captrhyz3ok7e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #830

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/830/display/redirect?page=changes>

Changes:

[noreply] TensorRT Initial commit (#22131)

[noreply] Fix Kafka performance test sourceOption to match expected hash (#23274)

[noreply] updated the pydoc for running a custom model on Beam (#23218)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8754cc0904872d37edbb8b4d3b8d9f92aad94acc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8754cc0904872d37edbb8b4d3b8d9f92aad94acc # timeout=10
Commit message: "updated the pydoc for running a custom model on Beam (#23218)"
 > git rev-list --no-walk 8b2676782a62f8bdf912395267056c9f37251338 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2966784103559661964.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0917150412 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ruxk63ctg3h76

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #829

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/829/display/redirect?page=changes>

Changes:

[noreply] Revert "Exclude protobuf 3.20.2" (#23237)

[noreply] Fix outdated code in python sdk install (#23231)

[noreply] Bump up dataflow python container version to beam-master-20220914

[noreply] Improve the performance of TextSource by reducing how many byte[]s are

[noreply] Issue#21430 Avoid pruning DataframeTransforms (#23069)

[noreply] Bump cloud.google.com/go/bigquery from 1.40.0 to 1.41.0 in /sdks

[noreply] [Website] Correct spelling of structural (#23225)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8b2676782a62f8bdf912395267056c9f37251338 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8b2676782a62f8bdf912395267056c9f37251338 # timeout=10
Commit message: "[Website] Correct spelling of structural (#23225)"
 > git rev-list --no-walk 6911520a5165f26a6966a54dd369e07764e6334c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8720898012671886377.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0916150418 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v34x7qxiltkgs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #828

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/828/display/redirect?page=changes>

Changes:

[noreply] Fix assignees check

[noreply] Update cibuildwheel (#23024)

[noreply] Add section to docs on resource hints/RunInference (#23215)

[noreply] (BQ Python) Perform job waits in finish_bundle to allow BQ streaming

[noreply] Update to newest version of CloudPickle. (#23223)

[bulat.safiullin] [Website] update site navigation  #22902

[noreply] Resolve script parsing error when changing from bash to sh. (#23199)

[noreply] Bump cloud.google.com/go/bigquery from 1.39.0 to 1.40.0 in /sdks

[noreply] Bump github.com/google/go-cmp from 0.5.8 to 0.5.9 in /sdks (#23123)

[noreply] Update google-cloud-bigquery requirement from <3,>=1.6.0 to >=1.6.0,<4

[noreply] Optimize varint reading and writing for small ints. (#23192)

[noreply] Pass namespace through RunInference transform (#23182)

[noreply] [GitHub Actions] - INFRA scripts to implement GCP Self-hosted runners

[noreply] GA migration - Base actions to use for precommit and postcommit

[noreply] Test fix Kafka Performance test batch (#23191)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6911520a5165f26a6966a54dd369e07764e6334c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6911520a5165f26a6966a54dd369e07764e6334c # timeout=10
Commit message: "Test fix Kafka Performance test batch (#23191)"
 > git rev-list --no-walk 66bbee84ed477d86008905646e68b100591b6f78 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3630290691538011520.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0915150422 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 incompatible Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tclpmcyyvwcho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #827

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/827/display/redirect?page=changes>

Changes:

[noreply] Open Allow and test pyarrow 8.x and 9.x (#22997)

[noreply] (BQ Python) Pass project field from options or parameter when writing

[noreply] Update python-machine-learning.md (#23209)

[noreply] Pin the version of cloudpickle to 2.1.x (#23120)

[noreply] Add streaming test for Write API sink (#21903)

[noreply] [Go SDK] Proto changes for timer param (#23216)

[noreply] Bump github.com/testcontainers/testcontainers-go in /sdks (#23201)

[noreply] Update to objsize to 0.5.2 which is under BSD-3 license (fixes #23096)

[noreply] Exclude insignificant whitespace from cloud object (#23217)

[noreply] Trying out property-based tests for Beam python coders (#22233)

[noreply] Publish results of JMH benchmark runs (Java SDK) to InfluxDB (part of

[noreply] Exclude protobuf 3.20.2 (#23226)

[noreply] Fix IllegalStateException in StorageApiWriteUnshardedRecords error


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 66bbee84ed477d86008905646e68b100591b6f78 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 66bbee84ed477d86008905646e68b100591b6f78 # timeout=10
Commit message: "Fix IllegalStateException in StorageApiWriteUnshardedRecords error handling. (#23205)"
 > git rev-list --no-walk c654e41cb40acad026a2a4665383b60c0227f694 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins528259775677710053.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0914150957 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mssb4hdbbwyak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #826

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/826/display/redirect?page=changes>

Changes:

[noreply] Bump dataflow java fnapi container version to beam-master-20220830

[noreply] [Issue#23071] Fix AfterProcessingTime for Python to behave like Java

[noreply] Don't depend on java 11 docker container for go test (#23197)

[Moritz Mack] Annotate stateful VR test in TestStreamTest with UsesStatefulParDo

[Moritz Mack] Properly close Spark (streaming) context if Pipeline translation fails

[noreply] [Playground] [Backend] Datastore queries and mappers to get precompiled


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c654e41cb40acad026a2a4665383b60c0227f694 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c654e41cb40acad026a2a4665383b60c0227f694 # timeout=10
Commit message: "[Playground] [Backend] Datastore queries and mappers to get precompiled objects (#22868)"
 > git rev-list --no-walk 2113ffcac3fa3d7522ceb22d03919e6edafe5e90 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6737678629944130951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0913150423 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2ury7jc46nrpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #825

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/825/display/redirect?page=changes>

Changes:

[noreply] [TPC-DS] Use common queries argument for Jenkins jobs (#23139)

[noreply] pubsublite: Reduce commit logspam (#22762)

[noreply] Added documentation in ACTIONS.md file (#23159)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2113ffcac3fa3d7522ceb22d03919e6edafe5e90 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2113ffcac3fa3d7522ceb22d03919e6edafe5e90 # timeout=10
Commit message: "Added documentation in ACTIONS.md file (#23159)"
 > git rev-list --no-walk 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4858410426911906550.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0912150428 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w2zze3npaz4ku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #824

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/824/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1526ca8c4cc6d58b3c28d816fc2597e51603d75f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
Commit message: "Improvements to SchemaTransform implementations for BQ and Kafka (#23045)"
 > git rev-list --no-walk 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7160520429474833951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0911150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/47dichaeys4ri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
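
Note on the recurring failure: every run in this thread fails at the same point, while Gradle is still evaluating sdks/python/apache_beam/testing/load_tests/build.gradle (the build fails within seconds, right after :buildSrc:build and before any load-test task starts). The message means the build script asked the ':runners:google-cloud-dataflow-java:worker' project for a 'shadowJar' property that could not be resolved at configuration time, which typically happens when that project has not been configured yet or no longer applies the Shadow plugin. A minimal Groovy sketch of the failure mode, plus one conventional workaround -- a hypothetical illustration, not the actual Beam build file:

    // Hypothetical reproduction. Eager cross-project access like this throws
    // "Could not get unknown property 'shadowJar'" whenever the worker project
    // has not been configured yet or does not apply the Shadow plugin.
    def workerJar = project(':runners:google-cloud-dataflow-java:worker').shadowJar

    // Order-independent alternative: force the worker project to be configured
    // first, then look the task up lazily by name as a TaskProvider.
    evaluationDependsOn(':runners:google-cloud-dataflow-java:worker')
    def workerJarProvider = project(':runners:google-cloud-dataflow-java:worker')
            .tasks.named('shadowJar')

evaluationDependsOn() and tasks.named() are standard Gradle APIs; whether either resembles the fix eventually applied in the Beam repository is not confirmed anywhere in this thread.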

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #823

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/823/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update shortcode languages from duplicate go to typescript

[cushon] Use a ClassLoadingStrategy that is compatible with Java 17+

[noreply] [Website] update case-studies logo images #22799 (#22793)

[noreply] [Website] change media-query max-width variable to ak-breakpoint-xl

[noreply] [Website] add overflow to code tags #22888 (#22427)

[noreply] Clean up Kafka Cluster and pubsub topic in rc validation script (#23021)

[noreply] Fix assertions in the Spanner IO IT tests (#23098)

[noreply] Use existing pickle_library flag in expansion service. (#23111)

[noreply] Assert pipeline results in performance tests (#23027)

[noreply] Consolidate Samza TranslationContext and PortableTranslationContext

[noreply] Improvements to SchemaTransform implementations for BQ and Kafka


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1526ca8c4cc6d58b3c28d816fc2597e51603d75f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1526ca8c4cc6d58b3c28d816fc2597e51603d75f # timeout=10
Commit message: "Improvements to SchemaTransform implementations for BQ and Kafka (#23045)"
 > git rev-list --no-walk 5734d3e3af68a22aa5a893d3cb9b138990b22911 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2353090329647728027.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0910150413 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zora55qdjxpkm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #822

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/822/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] add paddings to pillars-item, change styles of footer logos

[bulat.safiullin] [Website] add table-container-wrapper #22896

[yathu] Decrease derby.locks.waitTimeout in jdbc unit test

[noreply] Auto-cancel old unit test Actions Runs (#23095)

[noreply] Merge pull request #23092 Cross-language tests in github actions.

[noreply] Update CHANGES.md for 2.42.0 cut, and add 2.43.0 section (#23108)

[noreply] remove `"io/ioutil"` package (#23001)

[noreply] Add one NER example to use a spaCy model with RunInference (#23035)

[noreply] Bump google.golang.org/api from 0.94.0 to 0.95.0 in /sdks (#23062)

[noreply] Implement JsonUtils (#22771)

[noreply] Support models returning a dictionary of outputs (#23087)

[noreply] [TPC-DS] Store metrics into BigQuery and InfluxDB (#22545)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5734d3e3af68a22aa5a893d3cb9b138990b22911 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5734d3e3af68a22aa5a893d3cb9b138990b22911 # timeout=10
Commit message: "Merge pull request #2281: [Website] update homepage mobile styles"
 > git rev-list --no-walk 9efa3787aefe9198c7985dd30b16691cdba61a7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6150213836930741780.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0909150426 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yww32ylwpjkfw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #821

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/821/display/redirect?page=changes>

Changes:

[clementg] allow non-lts jvm version, fallback on java 11 for runner

[clementg] Add a stricter java version method

[clementg] fall back to the nearest lts version

[noreply] Keep stale action from closing issues (#23067)

[Robert Bradshaw] Use cloudpickle for Java Python transforms.

[noreply] Merge pull request #22996: [BEAM-11205] Update GCP Libraries BOM

[Robert Burke] Moving to 2.43.0-SNAPSHOT on master branch.

[noreply] clean up comments and register functional DoFn in wordcount.go (#23057)

[noreply] [Tour Of Beam][backend] integration tests and GA workflow (#23032)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9efa3787aefe9198c7985dd30b16691cdba61a7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9efa3787aefe9198c7985dd30b16691cdba61a7e # timeout=10
Commit message: "[Tour Of Beam][backend] integration tests and GA workflow (#23032)"
 > git rev-list --no-walk 0d937d4cd725965572d4720811fa2d6efaa8edf8 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8557624825898245103.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0908150406 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q32gufpst2rqg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #820

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/820/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Cosmetic checkstyle fix to TextRowCountEstimator

[Kenneth Knowles] Upgrade to Gradle 7.5.1

[Brian Hulette] Use typehints in benchmark utilities

[oleg.borisevich] fixing condition for db index creation

[Robert Bradshaw] Allow expansion service to choose pickler.

[noreply] Disable singleIterate (#23042)

[Robert Bradshaw] Accept "default" as pickler library.

[Robert Bradshaw] Clarifying comment.

[Heejong Lee] [BEAM-22856] PythonService Beam version compatibility

[chamikaramj] Fixes RunInference test failure

[noreply] Bump github.com/lib/pq from 1.10.6 to 1.10.7 in /sdks (#23061)

[noreply] Allowing more flexible precision for TIMESTAMP, DATETIME fields in

[noreply] Reenable run-inference tests on windows (#23044)

[noreply] [BEAM-12164] Support new value capture types NEW_ROW NEW_VALUES for s…

[noreply] Fix example registration input arity (#23059)

[noreply] Clarify inference example docs (#23018)

[noreply] [Playground] [Backend] Datastore queries and mappers to get examples


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0d937d4cd725965572d4720811fa2d6efaa8edf8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0d937d4cd725965572d4720811fa2d6efaa8edf8 # timeout=10
Commit message: "[Playground] [Backend] Datastore queries and mappers to get examples (#22955)"
 > git rev-list --no-walk ca9ee909e57e36f0027001f1c101852378105490 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins187943057147427115.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0907150402 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xbacdsf6xeele

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #819

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/819/display/redirect?page=changes>

Changes:

[noreply] Revert "Remove subprocess.PIPE usage by using a temp file (#22654)"

[noreply] Allow users to pass classloader to dynamically load JDBC drivers.

[noreply] Fix withCheckStopReadingFn to not cause the pipeline to crash (#22962)

[noreply] Inference benchmark tests (#21738)

[noreply] [Go SDK]: Add support for Google Cloud Profiler for pipelines (#22824)

[noreply] Listen to window messages to switch SDK and to load content (#22959)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ca9ee909e57e36f0027001f1c101852378105490 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ca9ee909e57e36f0027001f1c101852378105490 # timeout=10
Commit message: "Listen to window messages to switch SDK and to load content (#22959)"
 > git rev-list --no-walk 3c91e7b24a53a6a5b929ede58231bbc57c9ddced # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7085751590207907133.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0906150450 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 51

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q53xsqqecp4uo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #818

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/818/display/redirect?page=changes>

Changes:

[noreply] Generalize interface of InfluxDBPublisher to support more use cases


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3c91e7b24a53a6a5b929ede58231bbc57c9ddced (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3c91e7b24a53a6a5b929ede58231bbc57c9ddced # timeout=10
Commit message: "Generalize interface of InfluxDBPublisher to support more use cases (#22238) (#22260)"
 > git rev-list --no-walk 25c6ed74c9846c89a92655c1e8d313ef87d6adb1 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins81936991881343306.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0905150358 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cxehwsovm7zfq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #817

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/817/display/redirect?page=changes>

Changes:

[noreply] [#19857] Migrate to using a memory aware cache within the Python SDK


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 25c6ed74c9846c89a92655c1e8d313ef87d6adb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 25c6ed74c9846c89a92655c1e8d313ef87d6adb1 # timeout=10
Commit message: "[#19857] Migrate to using a memory aware cache within the Python SDK harness (#22924)"
 > git rev-list --no-walk 31561e2ff13147aa80f9f811e2a94ebe57b25374 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5062128391965480229.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0904150405 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rfvjfvxuxj7fq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #816

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/816/display/redirect?page=changes>

Changes:

[noreply] [Tour of Beam]: Welcome Screen frontend layout (#22794)

[noreply] Remove redundant testEventTimeTimerSetWithinAllowedLateness sickbay

[noreply] Adding support for Beam Schema Rows with BQ DIRECT_READ (#22926)

[noreply] Add java Bigquery IO known issue to beam 2.40 release blogpost (#22611)

[noreply] Update playground_deploy_examples.yml

[noreply] Add run-inference component for autolabeling (#22971)

[noreply] [Playground] [Infrastructure] Deleting the Cloud Storage Client (#22722)

[noreply] Updates Java RunInference to infer Python dependencies when possible

[noreply] Adding TensorFlow support to the Machine Learning overview page (#22949)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 31561e2ff13147aa80f9f811e2a94ebe57b25374 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 31561e2ff13147aa80f9f811e2a94ebe57b25374 # timeout=10
Commit message: "Adding TensorFlow support to the Machine Learning overview page (#22949)"
 > git rev-list --no-walk 4b46ef40289ddf33aac1aac0ca6741d96407bd3b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5349387671859956702.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0903150348 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4xgkjoh3xej3u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #815

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/815/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Update proto generation script due to BEAM-13939.

[Robert Bradshaw] Regenerate typescript protos.

[noreply] Add initial read_gbq wrapper (#22616)

[noreply] Minor: Fix lint failure (#22998)

[noreply] [Tour Of Beam][backend] get unit content (#22967)

[noreply] Allows to use databaseio with postgres driver (#22941)

[noreply] Bump cloud.google.com/go/storage from 1.25.0 to 1.26.0 in /sdks (#22954)

[noreply] [BEAM-22859] Allow the specification of extra packages for external


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b46ef40289ddf33aac1aac0ca6741d96407bd3b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b46ef40289ddf33aac1aac0ca6741d96407bd3b # timeout=10
Commit message: "[BEAM-22859] Allow the specification of extra packages for external Python transforms. (#22991)"
 > git rev-list --no-walk 2df47e7657ca2a9c3fd7b3c3fb578913d4ec4ec1 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1441962382760042168.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0902150453 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rmf3s3skyk7s4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #814

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/814/display/redirect?page=changes>

Changes:

[Brian Hulette] Extract utilities in dataframe.schemas

[Brian Hulette] Add pandas_type_compatibility with pandas BatchConverter implementations

[Brian Hulette] Use Batched DoFns at DataFrame API boundaries

[Brian Hulette] Move dtype conversion to pandas_type_compatibility

[Brian Hulette] Always register pandas BatchConverters

[Brian Hulette] Fix interactive runner tests

[Brian Hulette] Use pandas_type_compatibility BatchConverters for dataframe.schemas

[Brian Hulette] Skip test cases broken in pandas 1.1.x

[Brian Hulette] Address review comments

[Brian Hulette] yapf, typo in test

[noreply] Add ability to remove/clear map and set state (#22938)

[Brian Hulette] Add test to reproduce https://github.com/apache/beam/issues/22854

[Brian Hulette] Exercise row coder with nested optional struct

[Brian Hulette] Make RowTypeConstraint callable

[Brian Hulette] Add test to exercise RowTypeConstraint.__call__

[noreply] Fix gpu to cpu conversion with warning logs (#22795)

[noreply] Add Go stateful DoFns to CHANGES.md and fix linting violations (#22958)

[noreply] 22805: Upgrade Jackson version from 2.13.0 to 2.13.3 (#22806)

[noreply] Run cred rotation every month (#22977)

[noreply] [BEAM-12164] Synchronize access queue in ThroughputEstimator and

[noreply] Add some explanatory comments to the wordcount registration (#22989)

[noreply] Move Go examples under the cookbook directory to generic registration

[noreply] Improve BQ test utils to support JSON in a more simple manner (#22942)

[noreply] [fixes #22980] Migrate BeamFnLoggingClient to the new execution state


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2df47e7657ca2a9c3fd7b3c3fb578913d4ec4ec1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2df47e7657ca2a9c3fd7b3c3fb578913d4ec4ec1 # timeout=10
Commit message: "[fixes #22980] Migrate BeamFnLoggingClient to the new execution state sampler. (#22981)"
 > git rev-list --no-walk d615b624e9ff211e857d026d541c4d56fd18e2d3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9070524098383077627.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0901150439 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 35s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ibpi4e46kwurs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #813

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/813/display/redirect?page=changes>

Changes:

[noreply] Minor: Fix option_from_runner_api typehint (#22946)

[noreply] Filter out unsupported state tests (#22963)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d615b624e9ff211e857d026d541c4d56fd18e2d3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d615b624e9ff211e857d026d541c4d56fd18e2d3 # timeout=10
Commit message: "Filter out unsupported state tests (#22963)"
 > git rev-list --no-walk 3ede5b76e48b41e89bc67541ea5044ebe704e905 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6572178010655779109.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0831150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rcujv66224uz4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #812

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/812/display/redirect?page=changes>

Changes:

[yathu] Support Timestamp type in xlang JDBC Read and Write

[yathu] change urn name to millis_instant:v1

[yathu] Add standard_coders test

[yathu] Apply suggestions from code review

[yathu] Fix Java standard coder test

[yathu] Fix logical type with same language type gets completely hidden

[Robert Bradshaw] [BEAM-22923] Allow sharding specification for dataframe writes.

[noreply] [Playground] Update build_playground_backend.yml - add "Index creation"

[noreply] [Playground] [Backend] added SDK validation to save a code snippet

[noreply] Fix linting violations (#22934)

[noreply] [akvelon][tour-of-beam] backend bootstraps (#22556)

[noreply] Bump up postcommit timeout (#22937)

[noreply] Handle stateful windows correctly + integration test (#22918)

[noreply] Automatically infer state keys from their field name (#22922)

[noreply] Updates to multi-lang Java quickstart (#22927)

[noreply] Fix yaml duplicated mapping key (#22952)

[noreply] [Playground] [Infrastructure] Adding the Cloud Datastore client to save

[noreply] Fix jdbc date conversion offset 1 day (#22738)

[noreply] Set state integration test (#22935)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3ede5b76e48b41e89bc67541ea5044ebe704e905 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3ede5b76e48b41e89bc67541ea5044ebe704e905 # timeout=10
Commit message: "Set state integration test (#22935)"
 > git rev-list --no-walk 90baef11b6862e9f698df7ea888fe21dc69513e6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8585951424587115679.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0830191607 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/t3c5ebj4e7mqq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #811

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/811/display/redirect?page=changes>

Changes:

[noreply] Add set state in Go (#22919)

[noreply] Go Map State integration test (#22898)

[noreply] Add clear function for bag state types (#22917)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 90baef11b6862e9f698df7ea888fe21dc69513e6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 90baef11b6862e9f698df7ea888fe21dc69513e6 # timeout=10
Commit message: "Add clear function for bag state types (#22917)"
 > git rev-list --no-walk e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6857237950390374827.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0829150403 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ws3taovbd5czg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #810

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/810/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e9089dd99630d939f0c38fbacbe97a283e429fc2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
Commit message: "[BEAM-12164] Feat: Added support to Cloud Spanner Change Streams connector for including transaction tags in the Change Stream records (#22769)"
 > git rev-list --no-walk e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1618376271091599908.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0828150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/an4ptijbt7ue4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #809

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/809/display/redirect?page=changes>

Changes:

[noreply] Pass user specified destination type to UpdateSchemaDestination 

[noreply] [Go SDK] Stream decode values in single iterations (#22904)

[noreply] Enable autosharding for BQ: #22818

[noreply] Update wordcount_minimal.py by removing pipeline_args.extend (#22786)

[noreply] Add map state in the Go Sdk (#22897)

[noreply] [BEAM-12164] Feat: Added support to Cloud Spanner Change Streams


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e9089dd99630d939f0c38fbacbe97a283e429fc2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e9089dd99630d939f0c38fbacbe97a283e429fc2 # timeout=10
Commit message: "[BEAM-12164] Feat: Added support to Cloud Spanner Change Streams connector for including transaction tags in the Change Stream records (#22769)"
 > git rev-list --no-walk 8347b9e1d36cb8c2a1d863909d2d27a00a3efdaa # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6360238782817446115.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0827150406 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mbyx6p4bygkmk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #808

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/808/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-22723] Yield BatchElement batches at end of window.

[noreply] Update sdks/python/apache_beam/transforms/util_test.py

[noreply] [Website] add Python to KinesisIO in connectors #22845 (#22841)

[noreply] Combining state integration test (#22846)

[cushon] Update to Byte Buddy 1.12.14

[cushon] Add a regression test

[cushon] Add spotless exclusion

[noreply] Small lint fixes (#22890)

[noreply] Preserve state on SDK switch (#22430) (#22735)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8347b9e1d36cb8c2a1d863909d2d27a00a3efdaa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8347b9e1d36cb8c2a1d863909d2d27a00a3efdaa # timeout=10
Commit message: "Merge pull request #22814 from cushon/bb"
 > git rev-list --no-walk 42b1640a25d5dbdea08ae2feaa0d3e81f6278575 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8359479759866327575.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0826150401 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kmqbt7hooxaoc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #807

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/807/display/redirect?page=changes>

Changes:

[chamikaramj] Updates old releases to use archive.apache.org

[noreply] Fix a few linting issues (#22842)

[noreply] Add combining state support (#22826)

[noreply] Bump cloud.google.com/go/pubsub from 1.24.0 to 1.25.1 in /sdks (#22850)

[noreply] Bump google.golang.org/grpc from 1.48.0 to 1.49.0 in /sdks (#22838)

[noreply] [Website] update videos section (#22772)

[noreply] Update Dataflow fnapi_container-version (#22852)

[noreply] Go SDK Katas: Update beam module dependency (#22753)

[noreply] unskip sklearn IT test (#22825)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 42b1640a25d5dbdea08ae2feaa0d3e81f6278575 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 42b1640a25d5dbdea08ae2feaa0d3e81f6278575 # timeout=10
Commit message: "unskip sklearn IT test (#22825)"
 > git rev-list --no-walk 702ce768f7b21d7bd10c0c3efd4e0719f2d03bad # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7689278238522169569.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0825150441 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fqazjfu3qftoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #806

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/806/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Eliminate some null errors and rawtypes from sdks/java/core

[Kiley Sok] Update Beam 2.41.0 release docs

[noreply] [Playground] Setup Datastore in Playground project using Terraform -

[noreply] Add bag state support (#22816)

[Kiley Sok] Fix dates for 2.41.0 release

[noreply] added link to setup instructions in WordCount example (#22832)

[noreply] Bump google.golang.org/api from 0.93.0 to 0.94.0 in /sdks (#22839)

[noreply] Bump cloud.google.com/go/bigquery from 1.38.0 to 1.39.0 in /sdks

[noreply] Add an integration test for bag state (#22827)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 702ce768f7b21d7bd10c0c3efd4e0719f2d03bad (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 702ce768f7b21d7bd10c0c3efd4e0719f2d03bad # timeout=10
Commit message: "Add an integration test for bag state (#22827)"
 > git rev-list --no-walk c7938faea948403ed33336cc99a6ae2afa9f5c32 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4212287221470250831.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0824150414 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rjl2ywiy2j6lw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #805

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/805/display/redirect?page=changes>

Changes:

[yathu] Evaluate proper metric in TextIOIT

[Andrew Pilloud] Add Python nexmark to gradle

[Michael Luckey] Align neo4j error messages with API

[noreply] E2E basic state support (#22798)

[noreply] Add state integration test (#22815)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c7938faea948403ed33336cc99a6ae2afa9f5c32 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c7938faea948403ed33336cc99a6ae2afa9f5c32 # timeout=10
Commit message: "Merge pull request #22740: Evaluate proper metric in TextIOIT"
 > git rev-list --no-walk dfa5ec58a192a35c20e3f54c9300fd611a98f7b0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1409649442438083384.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0823150415 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
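
For context on the "2GB" in the job name: --input_options asks for 2,097,152 records with a 1-byte key and a 1,024-byte value each, so the value payload alone is 2,097,152 x 1,024 = 2,147,483,648 bytes, exactly 2 GiB (2,149,580,800 bytes including keys). Note also that -Dorg.gradle.jvmargs appears twice on this command line; when a JVM system property is repeated, the last occurrence typically wins, so the daemon would get -Xmx6g but not -Xms2g.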
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jyzhwzdbrddk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #804

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/804/display/redirect?page=changes>

Changes:

[noreply] Bump cloud.google.com/go/bigquery from 1.37.0 to 1.38.0 in /sdks

[noreply] Add Release category to release announcement blogs (#22785)

[noreply] [BEAM-13657] Update Python version used by mypy. (#22804)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision dfa5ec58a192a35c20e3f54c9300fd611a98f7b0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dfa5ec58a192a35c20e3f54c9300fd611a98f7b0 # timeout=10
Commit message: "[BEAM-13657] Update Python version used by mypy. (#22804)"
 > git rev-list --no-walk f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8111814919932914703.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0822153953 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xixpwid4jntdi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #803

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/803/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f921a2f1996cf906d994a9d62aeb6978bab09dd5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
Commit message: "Fix lint issues (#22800)"
 > git rev-list --no-walk f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8593965944380473509.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0821150351 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/plhiqo2s3jh7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #802

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/802/display/redirect?page=changes>

Changes:

[noreply] Modify RunInference to return PipelineResult for the benchmark tests

[noreply] Fix lint issues (#22800)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f921a2f1996cf906d994a9d62aeb6978bab09dd5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f921a2f1996cf906d994a9d62aeb6978bab09dd5 # timeout=10
Commit message: "Fix lint issues (#22800)"
 > git rev-list --no-walk 7a469fd20ef198a38e1df6af081062904dd1cbbb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8398984364719382607.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0820150403 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rmdvd7ylgqiis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #801

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/801/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] add scroll to new position if anchor is present #22699

[randomstep] [BEAM-8701] bump commons-io to 2.7

[bulat.safiullin] [Website] remove text from Available contact channels table #22696

[bulat.safiullin] [Website] update commits link #22520

[cushon] Downgrade bytebuddy version to 1.11.0

[noreply] fixed column width in tables in Getting started from Spark guide

[noreply] Testing authentication for Playground (#22782)

[noreply] [BEAM-12776, fixes #21095] Limit parallel closes from the prior element

[noreply] [BEAM-13015, #21250] Reuse buffers when possible when writing on

[noreply] [Go SDK] Fix go lint errors (#22796)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7a469fd20ef198a38e1df6af081062904dd1cbbb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7a469fd20ef198a38e1df6af081062904dd1cbbb # timeout=10
Commit message: "Merge pull request #22433: [BEAM-8701] bump commons-io to 2.7"
 > git rev-list --no-walk 75eb0b1431c84c98f2e16a9f535b0e11b0160d43 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2976450536618237898.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0819150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins
> Task :buildSrc:check
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 25s
10 actionable tasks: 8 executed, 1 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/3jxu5k63643bu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #800

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/800/display/redirect?page=changes>

Changes:

[noreply] Fix direct running mode multi_processing on win32 (#22730)

[noreply] Improve error message on schema issues (#22469)

[noreply] sklearn runinference regression example (#22088)

[noreply] [Website] add intuit case-study, add intuit quote-card (#22757)

[noreply] Avoid panic on type assert. (#22767)

[noreply] [#21935] Reject ill formed GroupByKey coders during pipeline.run

[noreply] Don't use batch interface for single object operations (#22432)

[noreply] Label kata changes with the language they're modifying (#22764)

[noreply] [Website] Add GitHub issue link (#22774)

[noreply] Fix some typos in the ML doc (#22763)

[noreply] Go stateful DoFns user side changes (#22761)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 75eb0b1431c84c98f2e16a9f535b0e11b0160d43 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 75eb0b1431c84c98f2e16a9f535b0e11b0160d43 # timeout=10
Commit message: "Go stateful DoFns user side changes (#22761)"
 > git rev-list --no-walk 60581e8b1b6e93889cce78542e99d1fea4105d54 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6404192359750385858.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0818150422 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2i2rtefbvtay6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #799

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/799/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015, #21250] Remove looking up thread local metrics container

[noreply] [fixes #22731] Publish nightly snapshot of legacy Dataflow worker jar.

[andyye333] Remove assert

[noreply] [fixes #22744] Update hadoop library patch versions to 2.10.2 and 3.2.4

[noreply] Update beam-master version for legacy (#22741)

[noreply] Bump google.golang.org/api from 0.92.0 to 0.93.0 in /sdks (#22752)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 60581e8b1b6e93889cce78542e99d1fea4105d54 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 60581e8b1b6e93889cce78542e99d1fea4105d54 # timeout=10
Commit message: "Bump google.golang.org/api from 0.92.0 to 0.93.0 in /sdks (#22752)"
 > git rev-list --no-walk 91c4b87aa95d89aac806ef374fda63637960bd6c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1328754146291886658.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0817150423 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rsjrnw5q6g47o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #798

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/798/display/redirect?page=changes>

Changes:

[Steve Niemitz] Fix UpdateSchemaDestination when source format is set to AVRO

[noreply] Add a dataflow override for runnerv1 to still use SDF on runnerv2.

[noreply] [Playground] Result filter bug (#22215)

[noreply] [Website] update case-studies layout (#22342)

[noreply] Implement KafkaSchemaTransformReadConfiguration (#22403)

[noreply] Handle single-precision float values in the standard coders tests


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 91c4b87aa95d89aac806ef374fda63637960bd6c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 91c4b87aa95d89aac806ef374fda63637960bd6c # timeout=10
Commit message: "Handle single-precision float values in the standard coders tests properly (#22716)"
 > git rev-list --no-walk 21584b132d23a30c60ec6d8da65f60b525cfd768 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9211931976431665544.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0816150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/i6h2dtybttuho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #797

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/797/display/redirect?page=changes>

Changes:

[noreply] fix minor unreachable code caused by log.Fatal (#22618)

[noreply] Attempt to fix SpannerIO test flakes (#22688)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 21584b132d23a30c60ec6d8da65f60b525cfd768 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 21584b132d23a30c60ec6d8da65f60b525cfd768 # timeout=10
Commit message: "Attempt to fix SpannerIO test flakes (#22688)"
 > git rev-list --no-walk 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4069278984046472960.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0815150401 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j52mnxtjecnjo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #796

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/796/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 184d8c59b34a70dac116517ac2791aeefa918bbb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
Commit message: "Bump up python container versions (#22697)"
 > git rev-list --no-walk 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins557870123585846007.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0814150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h3k2sx524lcry

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
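
Every failure in this digest is identical and happens during Gradle's
configuration phase, before any test executes: line 49 of the load-test build
script dereferences a 'shadowJar' property on the (masked) Dataflow worker
project, and that project evidently no longer registers a task by that name,
so the property lookup throws. Below is a minimal standalone build.gradle
sketch of the failure mode and a guarded alternative; the wiring shown is an
illustrative assumption, not the exact Beam source:

    // build.gradle -- reproduces the class of error seen above.
    plugins { id 'base' }

    // Eager property access: when no plugin has registered a 'shadowJar'
    // task, the Project object has no such property and configuration
    // aborts with "Could not get unknown property 'shadowJar' ...".
    //def workerJar = project.shadowJar.archiveFile  // fails as logged

    // Guarded lookup: findByName returns null instead of throwing, so the
    // build can skip the jar wiring when the task is absent.
    def shadowJarTask = project.tasks.findByName('shadowJar')
    if (shadowJarTask != null) {
        tasks.named('assemble').configure { dependsOn shadowJarTask }
    } else {
        logger.warn('shadowJar task not found; skipping jar wiring')
    }

With the guarded form, evaluation of the load_tests project would at least
complete, letting the run surface a clearer error later instead of aborting
at configuration time.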


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #795

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/795/display/redirect?page=changes>

Changes:

[noreply] [Playground] [Backend] added validation for snippet endpoints to avoid

[noreply] Add GeneratedClassRowTypeConstraint (#22679)

[noreply] [Playground] [Backend] Removing unused snippets manually and using the

[noreply] Implement PubsubSchemaTransformWriteConfiguration (#22262)

[noreply] Add support for FLOAT to Python RowCoder (#22626)

[noreply] Bump up python container versions (#22697)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 184d8c59b34a70dac116517ac2791aeefa918bbb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 184d8c59b34a70dac116517ac2791aeefa918bbb # timeout=10
Commit message: "Bump up python container versions (#22697)"
 > git rev-list --no-walk 7a9bb76fe9f4c167c1d125db9d2cff9a1a315149 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7945912937181671329.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0813150412 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hm35nkrcbee24

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #794

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/794/display/redirect?page=changes>

Changes:

[yathu] Bump mongo_java_driver to 3.12.11 and embed.mongo to 3.0.0

[noreply] Fix seed job (#22687)

[noreply] Bump actions/stale from 3 to 5 (#22684)

[noreply] Bump actions/upload-artifact from 2 to 3 (#22682)

[noreply] Bump actions/download-artifact from 2 to 3 (#22683)

[noreply] Add shunts for Beam typehints (#22680)

[noreply] Fix wordcount setup-java (#22700)

[noreply] Bump google.golang.org/api from 0.91.0 to 0.92.0 in /sdks (#22681)

[bulat.safiullin] [Website] add container with overflow-x to runners with table #22708

[noreply] Bump cloud.google.com/go/storage from 1.24.0 to 1.25.0 in /sdks (#22705)

[noreply] [Go SDK]: Implement standalone single-precision float encoder (#22664)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7a9bb76fe9f4c167c1d125db9d2cff9a1a315149 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7a9bb76fe9f4c167c1d125db9d2cff9a1a315149 # timeout=10
Commit message: "[Go SDK]: Implement standalone single-precision float encoder (#22664)"
 > git rev-list --no-walk cf9ea1f442636f781b9f449e953016bb39622781 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7902994283274065451.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0812150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 3s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lddqxzodplv3g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #793

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/793/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] update contribution content collapse

[noreply] Clean up checkstyle suppressions.xml (#22649)

[noreply] [Playground] [Infrastructure] format python code style (#22291)

[noreply] Minor: Add helpful names for parameterized dataframe.schemas_test

[noreply] [BEAM-14118, #21639] Use vendored gRPC 1.48.1 (#22628)

[Ismaël Mejía] Fix #22466 Add github actions dependency updates with dependabot

[noreply] Change Python PostCommits timeout (#22655)

[noreply] Revert "Persist ghprbPullId parameter in seed job (#22579)" (#22656)

[noreply] Bump actions/setup-java from 2 to 3 (#22666)

[noreply] Bump actions/labeler from 3 to 4 (#22670)

[noreply] Bump actions/setup-node from 2 to 3 (#22671)

[noreply] Bump actions/setup-go from 2 to 3 (#22669)

[noreply] Bump actions/setup-python from 2 to 4 (#22668)

[noreply] Bump actions/checkout from 2 to 3 (#22667)

[noreply] Fix broken link to Retry Policy blog (#22554)

[noreply] Include total in header of issue report (#22475)

[chamikaramj] Update vendored gRPC version for SpannerTransformRegistrarTest

[noreply] [Playground] Share any code feature frontend (#22477)

[noreply] Remove subprocess.PIPE usage by using a temp file (#22654)

[noreply] [#22647] Upgrade org.apache.samza to 1.6 (#22648)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cf9ea1f442636f781b9f449e953016bb39622781 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cf9ea1f442636f781b9f449e953016bb39622781 # timeout=10
Commit message: "[#22647] Upgrade org.apache.samza to 1.6 (#22648)"
 > git rev-list --no-walk fa9691fe2e95974e89fc5ff5ee572ca7bd52e1f2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1028733339240793798.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0811153615 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy and 1 incompatible Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gkbdmj4f3pi46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #792

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/792/display/redirect?page=changes>

Changes:

[108862444+oborysevych] removed VladMatyunin from beam collaborators

[anandinguva98] Add stdlib distutils while building the wheels

[noreply] Skip

[noreply] Persist ghprbPullId parameter in seed job (#22579)

[noreply] Adhoc: Fix logging in Spark runner to avoid unnecessary creation of

[noreply] Improve exception when requested error tag does not exist (#22401)

[noreply] Reimplement Pub/Sub Lite's I/O using UnboundedSource. (#22612)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fa9691fe2e95974e89fc5ff5ee572ca7bd52e1f2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fa9691fe2e95974e89fc5ff5ee572ca7bd52e1f2 # timeout=10
Commit message: "Reimplement Pub/Sub Lite's I/O using UnboundedSource. (#22612)"
 > git rev-list --no-walk d07bd6d2d7efe0b1da11b682b1fd88990186762d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7089110608641014132.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0809171004 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vxsc2rz3xwico

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #791

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/791/display/redirect?page=changes>

Changes:

[alexey.inkin] Fix retaining unsaved pipeline options (#22075)

[vlad.matyunin] modified WithKeys Playground Example

[alexander.zhuravlev] [Playground] Removed banner from Playground header, deleted unused

[shivam] Add example for `Distinct` PTransform

[manitgupta] Fix bug in StructUtils

[noreply] [Playground][Backend][Bug]: Moving the initialization of properties file

[noreply] Bump cloud.google.com/go/bigquery from 1.36.0 to 1.37.0 in /sdks

[noreply] Minor: Clean up an assertion in schemas_test (#22613)

[noreply] Exclude testWithShardedKeyInGlobalWindow on streaming runner v1 (#22593)

[noreply] Pub/Sub Schema Transform Read Provider (#22145)

[noreply] Update BigQuery URI validation to allow more valid URIs through (#22452)

[noreply] Add units tests for SpannerIO (#22428)

[noreply] Bump google.golang.org/api from 0.90.0 to 0.91.0 in /sdks (#22568)

[noreply] Fix for #22631 KafkaIO considers readCommitted() as it would commit back

[noreply] [CdapIO] Add CdapIO dashboard in Grafana (#22641)

[noreply] Add information on how to take/close issues in the contribution guide.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d07bd6d2d7efe0b1da11b682b1fd88990186762d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d07bd6d2d7efe0b1da11b682b1fd88990186762d # timeout=10
Commit message: "Add information on how to take/close issues in the contribution guide. (#22640)"
 > git rev-list --no-walk 1f2186de8eedd20c6d8d3ce31bdaa5334b5b23ea # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3040995601504528755.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0809150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ztofmquypjlne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #790

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/790/display/redirect?page=changes>

Changes:

[noreply] Add PyDoc buttons to the top and bottom of the Machine Learning page


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1f2186de8eedd20c6d8d3ce31bdaa5334b5b23ea (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1f2186de8eedd20c6d8d3ce31bdaa5334b5b23ea # timeout=10
Commit message: "Add PyDoc buttons to the top and bottom of the Machine Learning page (#22458)"
 > git rev-list --no-walk 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7451530249374269868.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0808153220 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kwcfiijfvrybi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #789

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/789/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 17fb9c0342064cd4375b0d7f2c37e12a175d03ef (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
Commit message: "Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)"
 > git rev-list --no-walk 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3431325426464779527.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0807150356 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dzyo6gatv3npm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #788

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/788/display/redirect?page=changes>

Changes:

[yathu] Moving misplaced CHANGES from template to 2.41.0

[noreply] Add Import transform to Go FhirIO (#22460)

[noreply] Allow unsafe triggers for python nexmark benchmarks (#22596)

[noreply] pubsublite: Fix max offset for computing backlog (#22585)

[noreply] Add support when writing to locked buckets by handling

[noreply] [BEAM-14118, #21639] Vendor gRPC 1.48.1 (#22607)

[noreply] [21894] Validates inference_args early (#22282)

[noreply] Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 17fb9c0342064cd4375b0d7f2c37e12a175d03ef (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 17fb9c0342064cd4375b0d7f2c37e12a175d03ef # timeout=10
Commit message: "Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)"
 > git rev-list --no-walk 6910d770b76d14558da4fee27b66601b4989440e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9126683278298291924.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0806150403 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/trt5ai22nsxp2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #787

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/787/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #22347: [22188]Set allowed timestamp skew

[noreply] Added experimental annotation to fixes #22564 (#22565)

[noreply] [BEAM-14117] Delete vendored bytebuddy gradle build (#22594)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6910d770b76d14558da4fee27b66601b4989440e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6910d770b76d14558da4fee27b66601b4989440e # timeout=10
Commit message: "[BEAM-14117] Delete vendored bytebuddy gradle build (#22594)"
 > git rev-list --no-walk 1a42618b153b7c985c537f4eaa6ab01e3e2e1d11 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2124180027231551407.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0805150542 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 24s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/krttbt5efqcwi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #786

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/786/display/redirect?page=changes>

Changes:

[noreply] Update run_inference_basic.ipynb

[noreply] Update CHANGE.md after 2.41.0 cut (#22577)

[noreply] Convert to BeamSchema type from ReadfromBQ (#17159)

[noreply] Fix deleteTimer in InMemoryTimerInternals and enable VR tests for

[noreply] Update Dataflow container version (#22580)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1a42618b153b7c985c537f4eaa6ab01e3e2e1d11 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1a42618b153b7c985c537f4eaa6ab01e3e2e1d11 # timeout=10
Commit message: "Update Dataflow container version (#22580)"
 > git rev-list --no-walk bf39489b2a1fd45e6798483d083e4ad240f66891 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1464535940740890707.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0804150409 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/x5yuasko43vvy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #785

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/785/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] add zstd compression support according to issue 22393

[Valentyn Tymofieiev] Regenerate the container dependencies.

[noreply] Remove normalization in Pytorch Image Segmentation example (#22371)

[chamikaramj] Mention Java RunInference support in the Website

[noreply] Downgrade less informative logs during write to files (#22273)

[noreply] Beam ml notebooks (#22510)

[noreply] Add clearer error message for xlang transforms on the Go Direct Runner

[noreply] [CdapIO] Add integration tests for CdapIO (Batch) (#22313)

[noreply] Bugfix: Fix broken assertion in PipelineTest (#22485)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision bf39489b2a1fd45e6798483d083e4ad240f66891 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bf39489b2a1fd45e6798483d083e4ad240f66891 # timeout=10
Commit message: "Merge pull request #22557: Mention Java RunInference support in the Website"
 > git rev-list --no-walk 48513adc665c32b32f50ff123bb18b66ca302934 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7224794548040047544.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0803150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/etrsotyrw7fha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #784

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/784/display/redirect?page=changes>

Changes:

[noreply] Exclude grpcio==1.48.0 (#22539)

[noreply] Merge PR #22304 fixing #22331 fixing JDBC IO IT

[noreply] Update pytest to support Python 3.10 (#22055)

[noreply] Update the imprecise link. (#22549)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 48513adc665c32b32f50ff123bb18b66ca302934 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 48513adc665c32b32f50ff123bb18b66ca302934 # timeout=10
Commit message: "Update the imprecise link. (#22549)"
 > git rev-list --no-walk e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1570776885633175696.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0802152014 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/epdwcvkxxn44k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #783

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/783/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
Commit message: "Improve concrete error message (#22536)"
 > git rev-list --no-walk e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8072414429117372895.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0801150421 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2n74wkohqhy5g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #782

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/782/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
Commit message: "Improve concrete error message (#22536)"
 > git rev-list --no-walk e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7656613693184781691.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0731150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/o5q37vtyhm5fy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #781

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/781/display/redirect?page=changes>

Changes:

[noreply] Change _build import from setuptools to distutils (#22503)

[noreply] Remove stringx package (#22534)

[noreply] Improve concrete error message (#22536)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e5e3cb25ca4fc2e31c10eb3dbda8289c6bfc7140 # timeout=10
Commit message: "Improve concrete error message (#22536)"
 > git rev-list --no-walk f4bd7b7236fdf4ca8068d8c42c6c7023646c015d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1811837316239827977.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0730150359 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ijpyj32mavppg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #780

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/780/display/redirect?page=changes>

Changes:

[chamikaramj] Remove unnecessary reference to use_runner_v2 experiment in x-lang

[yixiaoshen] Fix typo in Datastore V1ReadIT test

[noreply] Relax the google-api-core dependency. (#22513)

[noreply] Bump google.golang.org/protobuf from 1.28.0 to 1.28.1 in /sdks (#22517)

[noreply] Bump google.golang.org/api from 0.89.0 to 0.90.0 in /sdks (#22518)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f4bd7b7236fdf4ca8068d8c42c6c7023646c015d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f4bd7b7236fdf4ca8068d8c42c6c7023646c015d # timeout=10
Commit message: "Bump google.golang.org/api from 0.89.0 to 0.90.0 in /sdks (#22518)"
 > git rev-list --no-walk c6624c36cbbbc94f78ab1fd4660efd8132fa1952 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2865740341833837237.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0729150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ih5siw4r67grm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #779

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/779/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] remove beam-summit 2022 container with all related files

[samuelw] Fixes #22438. Ensure that WindmillStateReader completes all batched read

[noreply] Upgrades pip before installing Beam for Python default expansion service

[noreply] [Go SDK]: Plumb allowed lateness to execution (#22476)

[Valentyn Tymofieiev] Restrict google-api-core

[Valentyn Tymofieiev] Regenerate the container dependencies.

[noreply] Replace distutils with supported modules. (#22456)

[noreply] [22369] Default Metrics for Executable Stages in Samza Runner (#22370)

[Kiley Sok] Moving to 2.42.0-SNAPSHOT on master branch.

[noreply] Remove stripping of step name. Replace removing only suffix step name

[noreply] Add read/write PubSub integration example fhirio pipeline (#22306)

[noreply] Remove deprecated Session runner (#22505)

[noreply] Add Go test status to the PR template (#22508)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c6624c36cbbbc94f78ab1fd4660efd8132fa1952 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c6624c36cbbbc94f78ab1fd4660efd8132fa1952 # timeout=10
Commit message: "Add Go test status to the PR template (#22508)"
 > git rev-list --no-walk 0760f13c4a5ca1dcfa0e2fad7d875e2d2f050963 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1487969989776985474.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0728150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6pw4zjsoqcayw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #778

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/778/display/redirect?page=changes>

Changes:

[chamikaramj] Adds KV support for the Java RunInference transform.

[noreply] Replace distutils with supported modules. (#21968)

[noreply] Revert "Replace distutils with supported modules. " (#22453)

[noreply] Enable configuration to avoid successfully written Table Row propagation

[noreply] lint fixes for recent import (#22455)

[noreply] Bump Python Combine LoadTests timeout to 12 hours (#22439)

[noreply] convert windmill min timestamp to beam min timestamp (#21915)

[noreply] [CdapIO] Fixed necessary warnings (#22399)

[noreply] [#22051]: Add read_time support to Google Cloud Datastore connector

[noreply] 21730 fix offset resetting (#22450)

[noreply] Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0760f13c4a5ca1dcfa0e2fad7d875e2d2f050963 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0760f13c4a5ca1dcfa0e2fad7d875e2d2f050963 # timeout=10
Commit message: "Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)"
 > git rev-list --no-walk 5141ad8790a57e2fa62af607f32736e3eed399e3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6808241897335409012.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0727150407 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uenjw4fqrxmaa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #777

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/777/display/redirect?page=changes>

Changes:

[Steve Niemitz] Fix overly aggressive null check in RowWriterFactory

[bulat.safiullin] add executeAsTemplate to head, head_homepage, add absURL to page-nav.js,

[noreply] Bump cloud.google.com/go/bigquery from 1.35.0 to 1.36.0 in /sdks

[noreply] Disallow EventTimes in iterators (#22435)

[noreply] Update the upper bound for google-cloud-recommendations-ai. (#22398)

[noreply] LoadTestsBuilder: Disallow whitespace in option values (#22437)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5141ad8790a57e2fa62af607f32736e3eed399e3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5141ad8790a57e2fa62af607f32736e3eed399e3 # timeout=10
Commit message: "Merge pull request #21949: [WEBSITE] fix relative paths bug on staging in js files"
 > git rev-list --no-walk 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins700630480883230391.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0726150417 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7sxtxcjn3hrcu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
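
Every failure in this archive is the same configuration-time error: Gradle aborts while evaluating the load-tests project because line 49 of its build.gradle dereferences a 'shadowJar' property on the Dataflow worker project before (or without) the Shadow plugin registering that task. Below is a minimal sketch of how this class of error arises and two common ways around it; the two-project layout and task bodies are hypothetical, not Beam's actual build files.

  // settings.gradle (sketch) -- hypothetical two-project build
  include ':worker', ':load-tests'

  // worker/build.gradle (sketch): 'shadowJar' exists only once this plugin applies
  plugins {
      id 'java'
      id 'com.github.johnrengelman.shadow' version '7.1.2'
  }

  // load-tests/build.gradle (sketch): an eager property lookup at
  // configuration time. If ':worker' has not been evaluated yet (or never
  // applies the Shadow plugin), evaluation aborts exactly like the builds
  // above: "Could not get unknown property 'shadowJar' for project
  // ':worker' of type org.gradle.api.Project."
  task runLoadTest {
      dependsOn project(':worker').shadowJar   // eager, evaluation-order-dependent
  }

  // Fix 1: force ':worker' to be evaluated first (must appear before any
  // reference into that project).
  evaluationDependsOn(':worker')

  // Fix 2: reference the task by path; Gradle resolves task-path strings
  // lazily while assembling the task graph, not during script evaluation.
  task runLoadTestByPath {
      dependsOn ':worker:shadowJar'
  }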



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #776

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/776/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54b0784da7ccba738deff22bd83fbc374ad21d2e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
Commit message: "Remove spaces in experiments (#22423)"
 > git rev-list --no-walk 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1023090805032605811.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0725150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/iukbw53ol3qhw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
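
Every invocation above also passes -PwithDataflowWorkerJar=true, so the failing reference at line 49 is presumably only wired up when that property is set. A hedged sketch of gating the wiring on the property while using a task-path string, which defers resolution to task-graph assembly instead of script evaluation (the 'run' task body here is a placeholder, not Beam's):

  // load_tests/build.gradle (sketch)
  task run {
      doLast { println 'launching pubsub_io_perf_test' }  // placeholder body
  }
  if (project.hasProperty('withDataflowWorkerJar')) {
      // A task-path string never dereferences the worker project during
      // evaluation, so it cannot raise the unknown-property error:
      run.dependsOn ':runners:google-cloud-dataflow-java:worker:shadowJar'
  }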



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #775

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/775/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54b0784da7ccba738deff22bd83fbc374ad21d2e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
Commit message: "Remove spaces in experiments (#22423)"
 > git rev-list --no-walk 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1843953169948731700.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0724150410 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/og7dqrp5mqoyi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
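
Several of these runs fail within 1-2 seconds, before any real work starts, so the error is cheap to reproduce locally. One way to confirm whether the worker project ever registers shadowJar is a post-evaluation hook; the snippet below is purely illustrative and only fires if evaluation completes, so the failing reference would have to be commented out first:

  // Diagnostic sketch: after all projects are evaluated, report whether the
  // worker project actually registered a 'shadowJar' task.
  gradle.projectsEvaluated {
      def worker = project(':runners:google-cloud-dataflow-java:worker')
      println "shadowJar registered: ${worker.tasks.findByName('shadowJar') != null}"
  }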



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #774

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/774/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] change getting window width method

[noreply] Bump cloud.google.com/go/storage from 1.23.0 to 1.24.0 in /sdks (#22377)

[Pablo Estrada] Removing experimental annotation from JdbcIO

[noreply] Drop timeseries:postCommit dependency (#22414)

[noreply] Deduplicate identical environments in a pipeline. (#22308)

[noreply] Skip failing torch post commit test (#22418)

[noreply] Log level fix on local runner (#22420)

[noreply] Update element_type inference (default_type_hints) for batched DoFns

[noreply] Remove spaces in experiments (#22423)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54b0784da7ccba738deff22bd83fbc374ad21d2e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54b0784da7ccba738deff22bd83fbc374ad21d2e # timeout=10
Commit message: "Remove spaces in experiments (#22423)"
 > git rev-list --no-walk b9f6af54d52428dcff910f9fa8b01fa0d474f5e0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6289864065729569260.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0723150409 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v7fwgorqkoegc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #773

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/773/display/redirect?page=changes>

Changes:

[balazs.nemeth] BEAM-14525 Fix for Protobuf getter/setter method name discovery issue

[balazs.nemeth] BEAM-14525 Added a proto message with the problematic properties to use

[balazs.nemeth] PR CR: updating issue links

[noreply] Add accept-language header for MPL license (#22395)

[noreply] Bump terser from 5.9.0 to 5.14.2 in

[noreply] Fixes #22156: Fix Spark3 runner to compile against Spark 3.2/3.3 and add

[Moritz Mack] Closes #22407: Separate sources for SparkStructuredStreamingRunner for

[Moritz Mack] Add deprecation warning for Spark 2 in SparkStructuredStreamingRunner


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b9f6af54d52428dcff910f9fa8b01fa0d474f5e0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b9f6af54d52428dcff910f9fa8b01fa0d474f5e0 # timeout=10
Commit message: "Merge pull request #22408 from mosche/22407-separate-spark-ssrunner-sources"
 > git rev-list --no-walk 50346b5d1414f671a60f117e0f50a0c16172afb7 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5568699033481057182.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0722150417 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dmc6fexp5f7lo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #772

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/772/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Support combiner lifting.

[noreply] Bump google.golang.org/api from 0.87.0 to 0.88.0 in /sdks (#22350)

[Robert Bradshaw] More clarification.

[noreply] [CdapIO] HasOffset interface was implemented (#22193)

[noreply] added olehborysevych as collaborator (#22391)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 50346b5d1414f671a60f117e0f50a0c16172afb7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 50346b5d1414f671a60f117e0f50a0c16172afb7 # timeout=10
Commit message: "added olehborysevych as collaborator (#22391)"
 > git rev-list --no-walk 4821e035c148df1ed7eb9e7054e47fe2a7003a1f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6040133883200477665.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0721150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zt73satenzsr4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #771

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/771/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Require unique names for stages.

[noreply] cleaned up types in standard_coders.ts (#22316)

[noreply] JMH module for sdks:java:core with benchmarks for

[noreply] Bump cloud.google.com/go/pubsub from 1.23.1 to 1.24.0 in /sdks (#22332)

[Luke Cwik] [#22181] Fix java package for SDK java core benchmark

[Luke Cwik] Allow jmhTest to run concurrently with other jmhTest instances

[noreply] [BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4821e035c148df1ed7eb9e7054e47fe2a7003a1f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4821e035c148df1ed7eb9e7054e47fe2a7003a1f # timeout=10
Commit message: "[BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)"
 > git rev-list --no-walk efde3f174c7ac502b24116d308249af52db52a2c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5505139809395329978.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0720150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rr7mkgawmou6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #770

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/770/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14117] Unvendor bytebuddy dependency (#17317)

[noreply] Use npm ci instead of install in CI (#22323)

[noreply] Fix typo in use_single_core_per_container logic. (#22318)

[noreply] [#22319] Regenerate proto2_coder_test_messages_pb2.py manually (#22320)

[noreply] Add links to the new RunInference content to Learning Resources (#22325)

[noreply] Unskip RunInference IT tests (#22324)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision efde3f174c7ac502b24116d308249af52db52a2c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f efde3f174c7ac502b24116d308249af52db52a2c # timeout=10
Commit message: "Unskip RunInference IT tests (#22324)"
 > git rev-list --no-walk 799eed2cc38ed6319d7b54a3ee0114c539d0f0af # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1118994328673007428.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0719150356 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2mlhnzlb7qiie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #769

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/769/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [website] Add TPC-DS benchmark documentation

[noreply] Increase streaming server timeout  (#22280)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 799eed2cc38ed6319d7b54a3ee0114c539d0f0af (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 799eed2cc38ed6319d7b54a3ee0114c539d0f0af # timeout=10
Commit message: "Merge pull request #22047: [website] Add TPC-DS benchmark documentation"
 > git rev-list --no-walk 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4603993909006408682.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0718150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w62ovapgst73u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #768

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/768/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
Commit message: "Merge pull request #22259 from akvelon/pg-trigger-deploy-examples"
 > git rev-list --no-walk 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins648330945179033794.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0717150354 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xkn2247pyrdjo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #767

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/767/display/redirect?page=changes>

Changes:

[noreply] Bump protobufjs from 6.11.2 to 6.11.3 in /sdks/typescript

[vlad.matyunin] enabled multifile flag for multifile examples (PG)

[Robert Bradshaw] Don't try to parse non-flags as retained pipeline options.

[chamikaramj] Enables UnboundedSource wrapped SDF Kafka source by default for x-lang

[noreply] Merge pull request #22140 from [Playground Task] Sharing any code API

[bulat.safiullin] [Website] add playground section, update playground, update get-started

[noreply] RunInference documentation updates. (#22236)

[noreply] Turn pr bot on for remaining common labels (#22257)

[noreply] Reviewing the RunInference ReadMe file for clarity. (#22069)

[noreply] Collect heap profile on OOM on Dataflow (#22225)

[noreply] fixing the missing wrap around ring range read (#21786)

[noreply] Update RunInference documentation (#22250)

[noreply] Rewrote Java multi-language pipeline quickstart (#22263)

[noreply] Merge pull request #22300 from Fixed [Playground] DeployExamples,


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 20274f35c9fd11f4d815c8d1d88df2ad874dfa3e # timeout=10
Commit message: "Merge pull request #22259 from akvelon/pg-trigger-deploy-examples"
 > git rev-list --no-walk 673a4cc793036050596aa340d91f26b461cb88e5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1408643837133180674.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0716150406 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
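
The "2GB" in the job name follows directly from the --input_options
payload above, ignoring per-record and transport overhead:

  2,097,152 records x (1 B key + 1,024 B value) = 2,149,580,800 B ~= 2.0 GiB

Note also that the command defines -Dorg.gradle.jvmargs twice; when the
same system property is passed more than once, the last definition
typically wins, so only -Xmx6g presumably takes effect, and a combined
-Dorg.gradle.jvmargs='-Xms2g -Xmx6g' would preserve both settings.
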
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4memxzpklbbls

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #766

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/766/display/redirect?page=changes>

Changes:

[vitaly.terentyev] [BEAM-14101] Add Spark Receiver IO package and ReceiverBuilder

[egalpin] Moves timestamp skew override to correct place

[egalpin] Adds TestStream to verify window preservation of ElasticsearchIO#write

[egalpin] Removes unnecessary line

[Heejong Lee] [BEAM-22229] Override external SDK container URLs for Dataflow by

[egalpin] Adds validation that ES#Write outputs are in expected windows

[egalpin] Updates window verification test to assert the exact docs in the window

[egalpin] Uses guava Iterables over shaded avro version

[danthev] Fix query retry in Java FirestoreIO.

[noreply] Pg auth test (#22277)

[noreply] [BEAM-14073] [CdapIO] CDAP IO for batch plugins: Read, Write. Unit tests

[Heejong Lee] update

[noreply] [Fix #22151] Add fhirio.Deidentify transform (#22152)

[noreply] Remove locks around ExecutionStateSampler (#22190)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 673a4cc793036050596aa340d91f26b461cb88e5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 673a4cc793036050596aa340d91f26b461cb88e5 # timeout=10
Commit message: "Merge pull request #22183 from egalpin/egalpin/timestamp-skew-es"
 > git rev-list --no-walk 67e6726ffeb47d2ada0122369fa230833ce0f026 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4326759813939225156.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0715150407 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wlcbeweytc7y2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #765

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/765/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14506] Adding testcases and examples for xlang Python RunInference

[Heejong Lee] update

[Heejong Lee] update

[noreply] Move youngoli to the reviewer exclusion list (#22195)

[noreply] Bump google.golang.org/api from 0.86.0 to 0.87.0 in /sdks (#22253)

[noreply] Bump cloud.google.com/go/bigquery from 1.34.1 to 1.35.0 in /sdks

[noreply] Bump google.golang.org/grpc from 1.47.0 to 1.48.0 in /sdks (#22252)

[noreply] Merge pull request #15786: Add gap-filling transform for timeseries

[chamikaramj] Adds an experiment that allows opting into using Kafka SDF-wrapper

[noreply] Defocus iframe on blur or mouseout (#22153) (#22154)

[noreply] Fix pydoc rendering for annotated classes (#22121)

[noreply] Fix typo in comment (#22266)

[noreply] Split words on new lines or spaces (#22270)

[noreply] Replace \r\n, not just \n


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 67e6726ffeb47d2ada0122369fa230833ce0f026 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 67e6726ffeb47d2ada0122369fa230833ce0f026 # timeout=10
Commit message: "Replace \r\n, not just \n"
 > git rev-list --no-walk fa5bcfa36137f9ba93dcd3c2a7b23be061edb065 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5568788148525185145.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0714150425 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xmpbwqdisdgrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #764

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/764/display/redirect?page=changes>

Changes:

[naireenhussain] add new pubsub urn

[Pablo Estrada] Several requests to show experiments in Dataflow UI

[byronellis] Add org.pentaho to calcite relocated packages to fix vendoring

[noreply] Adding VladMatyunin as collaborator (#22239)

[noreply] Mark session runner as deprecated (#22242)

[noreply] Update google-cloud-core dependency to <3 (#22237)

[noreply] Move WC integration test to generic registration (#22248)

[noreply] Move Xlang Go examples to generic registration (#22249)

[noreply] Move Go Primitives Integration Tests to Generic Registration (#22247)

[noreply] Move native Go examples to generic registration (#22245)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fa5bcfa36137f9ba93dcd3c2a7b23be061edb065 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fa5bcfa36137f9ba93dcd3c2a7b23be061edb065 # timeout=10
Commit message: "Move native Go examples to generic registration (#22245)"
 > git rev-list --no-walk a8775f0a4ac13fea440dc6e4b18f1bd5f821fcaf # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9007469392348148553.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0713150359 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/k45gbekyxc4yc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #763

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/763/display/redirect?page=changes>

Changes:

[noreply] Split checkStyle from precommit into spotless job (#22203)

[noreply] Allow one to bound the size of output shards when writing to files.

[noreply] Bump moment from 2.29.2 to 2.29.4 in

[noreply] Allow BigQuery TableIds to have space in between (#22167)

[noreply] Use async as a suffix rather than a prefix for asynchronous variants.

[noreply] Override log levels after log handler is created (#22191)

[noreply] Remove deprecated unused option in seed job script (#22223)

[noreply] Better error for external BigQuery tables. (#22178)

[noreply] Try to fix playground workflow (#22226)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a8775f0a4ac13fea440dc6e4b18f1bd5f821fcaf (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a8775f0a4ac13fea440dc6e4b18f1bd5f821fcaf # timeout=10
Commit message: "Try to fix playground workflow (#22226)"
 > git rev-list --no-walk 262f2b7f91ac879cb8921a3e7d59d0315c9df9c4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5877738462245646909.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0712150510 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ccoryyrzh2asw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #762

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/762/display/redirect?page=changes>

Changes:

[noreply] Parallelizable DataFrame/Series mean (#22174)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 262f2b7f91ac879cb8921a3e7d59d0315c9df9c4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 262f2b7f91ac879cb8921a3e7d59d0315c9df9c4 # timeout=10
Commit message: "Parallelizable DataFrame/Series mean (#22174)"
 > git rev-list --no-walk 9fb8be0e3d9a44109024fb9b3c57c3997ec33a3d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins374731059456405510.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cbpfed4p75ijq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #761

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/761/display/redirect?page=changes>

Changes:

[noreply] Add typescript documentation to the programing guide. (#22137)

[noreply] [Website] Update minimum required Go version for sdk development


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9fb8be0e3d9a44109024fb9b3c57c3997ec33a3d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9fb8be0e3d9a44109024fb9b3c57c3997ec33a3d # timeout=10
Commit message: "[Website] Update minimum required Go version for sdk development (#22210)"
 > git rev-list --no-walk 4b4077dc8828452e6a49b1bc00db2fa551e453fb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6236056406817245835.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hb76csa3a6iw2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #760

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/760/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] change case studies link from staging to relative path

[bulat.safiullin] [Website] add I/O Connectors link to dropdown list, updating link to

[noreply] Merge pull request #22096 from [Playground] Infrastructure for sharing

[noreply] Support dependencies and remote registration in the typescript SDK.

[noreply] [BEAM-13015, #22050] Make SDK harness msec counters faster using ordered

[yathu] Fix build error due to dep confliction of google-cloud-bigquery-storage

[yathu] Fix atomicwrites old version purge on pypi

[noreply] Fix default type inference of CombinePerKey. (#16351)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b4077dc8828452e6a49b1bc00db2fa551e453fb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b4077dc8828452e6a49b1bc00db2fa551e453fb # timeout=10
Commit message: "Merge pull request #22205 Fix build error due to dep confliction of google-cloud-bigquery-storage and google-cloud-core"
 > git rev-list --no-walk d44c0440bc91f8fd63dcd082c2acf50b40e7af1b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5127195298608250411.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zfv37czrpgf7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #759

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/759/display/redirect?page=changes>

Changes:

[bulat.safiullin] [Website] add refresh to page-nav.js

[relax] set timestamp when outputting finalize element

[alexey.inkin] Declarative theming, Remove duplicate PlaygroundState for embedded page,

[yathu] Fix Hadoop upload corrupted due to buffer reuse

[benjamin.gonzalez] Fix testKafkaIOReadsAndWritesCorrectlyInStreaming failing for kafka

[noreply] Add `schema_options` and `field_options` on RowTypeConstraint (#22133)

[noreply] Optimize locking in several critical-path methods (#22162)

[noreply] Deprecate AWS IOs (Java) using AWS SDK v1 in favor of IOs in

[noreply] Update Go BPG xlang documentation to include Java automated service


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d44c0440bc91f8fd63dcd082c2acf50b40e7af1b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d44c0440bc91f8fd63dcd082c2acf50b40e7af1b # timeout=10
Commit message: "Update Go BPG xlang documentation to include Java automated service start-up (#22187)"
 > git rev-list --no-walk df162c1e2fb221c64cd861605fb35b37d2e6b8ec # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5519208075142530380.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qiwzuygmt4i4u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #758

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/758/display/redirect?page=changes>

Changes:

[noreply] Enable passing tests on dataflow runner v2. (#22136)

[noreply] Merge pull request #17727 from [BEAM-9482] Fix "provided port is already

[noreply] Fix date for go 2.40 blog post

[noreply] Fix month for 2.40 go blog post

[noreply] [BEAM-14545] Optimize copies in dataflow v1 shuffle reader. (#17802)

[noreply] Tune StreamingModeExecutionContext allocations (#22142)

[noreply] [BEAM-3221] Improve documentation around split request and response

[noreply] Fix documentation about hand implemented global aggregations (#22173)

[noreply] Merge pull request #21872 from Standardizing output of WriteToBigQuery

[noreply] Propogate error messages from GcsUtil (#22079)

[noreply] Reenable Jenkins comment triggers (#22169)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision df162c1e2fb221c64cd861605fb35b37d2e6b8ec (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f df162c1e2fb221c64cd861605fb35b37d2e6b8ec # timeout=10
Commit message: "Reenable Jenkins comment triggers (#22169)"
 > git rev-list --no-walk 6dea0d15d0a97d243a2fe56684c2e193cbea14d2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2838735083460005702.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706185554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ipvnewhrxaydq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
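
For reference, the "2GB" in the job name follows directly from the --input_options JSON in the invocation above: 2097152 records, each a 1-byte key plus a 1024-byte value. A quick check as a standalone Groovy snippet, using only the values copied from that flag:

    // 2097152 * (1 + 1024) = 2,149,580,800 bytes, i.e. just over 2 GiB
    long numRecords  = 2097152      // "num_records" (2**21)
    long recordBytes = 1 + 1024     // "key_size" + "value_size"
    long totalBytes  = numRecords * recordBytes
    println "${totalBytes} bytes = ${totalBytes / (1L << 30)} GiB"  // ~2.002 GiB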



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #757

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/757/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11103] Add blog post for go 2.40 release (#17723)

[noreply] Fix test_row_coder_fail_early_bad_schema fails run after

[noreply] Tune ByteStringCoder allocations (#22144)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6dea0d15d0a97d243a2fe56684c2e193cbea14d2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6dea0d15d0a97d243a2fe56684c2e193cbea14d2 # timeout=10
Commit message: "Tune ByteStringCoder allocations (#22144)"
 > git rev-list --no-walk b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins184951323795895140.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0706150414 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yniw45ai7oefk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
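
One detail of the invocation worth flagging: -Dorg.gradle.jvmargs is passed twice, once with -Xms2g and once with -Xmx6g. Since a later -D definition of the same system property overrides an earlier one, the daemon most likely only sees -Xmx6g and the -Xms2g setting is silently dropped. If both settings are intended, they would need to be combined into a single definition, e.g.:

    -Dorg.gradle.jvmargs='-Xms2g -Xmx6g'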



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #756

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/756/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
Commit message: "Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)"
 > git rev-list --no-walk b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8600529995593285418.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0705150404 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gwtgcon5sbt4k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #755

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/755/display/redirect?page=changes>

Changes:

[noreply] Go SDK: Update memfs to parse the List() pattern as a glob, not a regexp

[noreply] Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b53b16f1fb41913b0e8bbe9d64d71b8e3ebfbbf6 # timeout=10
Commit message: "Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)"
 > git rev-list --no-walk 85e8149cbcebc4a6b07d09501f96dfaec95c73bc # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6621171985168136527.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0704150407 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mp2yzbwihx5oq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #754

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/754/display/redirect?page=changes>

Changes:

[noreply] Sharding IO tests(amazon web services and amazon web services 2) from


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 85e8149cbcebc4a6b07d09501f96dfaec95c73bc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 85e8149cbcebc4a6b07d09501f96dfaec95c73bc # timeout=10
Commit message: "Sharding IO tests(amazon web services and amazon web services 2) from java post commit task (#21808)"
 > git rev-list --no-walk eb5b7cc256d8d15173475cf51af758979a33bd16 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2341324610393051310.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0703150408 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/e2c2xbt4mcye2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #753

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/753/display/redirect?page=changes>

Changes:

[noreply] Python: Use RowTypeConstraint for normalizing all schema-inferrable user

[noreply] changing nameBase value to Java_GCP_IO_Direct (#22128)

[noreply] Bump dataflow fnapi java sdk version (#22127)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision eb5b7cc256d8d15173475cf51af758979a33bd16 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f eb5b7cc256d8d15173475cf51af758979a33bd16 # timeout=10
Commit message: "Bump dataflow fnapi java sdk version (#22127)"
 > git rev-list --no-walk 680ed5b3a49990e2de0730b49233dfe22cfe9b8f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1910419592067663947.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0702150411 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rwbkvms46kzv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #752

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/752/display/redirect?page=changes>

Changes:

[alexey.inkin] Do not re-create PlaygroundState (#21950)

[Moritz Mack] Deprecate runner support for Spark 2.4 (closes #22094)

[noreply] Fixes #21698: Use normal Container snapshots for Go Load Tests (#22102)

[noreply] Change default, options, and explanation for issue priority (#22116)

[noreply] Minor: Bump flake8 to 4.0.1 (#22110)

[noreply] Add sdk_harness_log_level_overrides option for python sdk (#22077)

[noreply] Fix typo in Pytorch Bert Language Modeling (#22114)

[noreply] Fix #21977: Add Search transform to Go FhirIO (#21979)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 680ed5b3a49990e2de0730b49233dfe22cfe9b8f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 680ed5b3a49990e2de0730b49233dfe22cfe9b8f # timeout=10
Commit message: "Merge pull request #22097 from mosche/22094-DeprecateSpark2"
 > git rev-list --no-walk dd813a7f7352c077cc1c433ffe2bfe05f22d4b8d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4379013325862009552.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0701150419 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sgjwzorm4ocwk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #751

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/751/display/redirect?page=changes>

Changes:

[alexey.inkin] Add an abstract layer for analytics, fix logging change of snippet, fix

[bulat.safiullin] [Website] add scroll-spy to body in case-studies/baseof.html

[noreply] [BEAM-6597] Replace ProgressRequestCallback with BundleProgressReporter

[noreply] [Go SDK] Go Lint fixes  (#21967)

[noreply] Fix #21869: Close GRPC connections on cancel (#21874)

[noreply] Add FlatMap(<builtin>) known issue to 2.40.0 blog (#22101)

[noreply] [BEAM-14347] Update docs to prefer generic registration functions

[Andrew Pilloud] Projection Pushdown optimizer on by default

[noreply] Merge pull request #21752 from Feature/beam 13852 reimplement with

[noreply] Change wording of Pytorch LM example (#22099)

[noreply] Fix missing model_params in Pytorch docstring  (#22100)

[noreply] Test and fix FlatMap(<builtin>) issue (#22104)

[noreply] Fix InputStream on platform with 4 bytie long (#22107)

[noreply] [BEAM-14187] Fix NPE at initializeForKeyedRead in IsmReaderImpl (#22111)

[noreply] Remove unused legacy dataflow translate code (#22019)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision dd813a7f7352c077cc1c433ffe2bfe05f22d4b8d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dd813a7f7352c077cc1c433ffe2bfe05f22d4b8d # timeout=10
Commit message: "Remove unused legacy dataflow translate code (#22019)"
 > git rev-list --no-walk 340b4217639753e7b16dedce29916491644a6c82 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8376766846722049186.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0630154742 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p52mn5uq2fbdq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #750

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/750/display/redirect?page=changes>

Changes:

[damondouglas] Implement PubsubSchemaTransformMessageToFactory

[noreply] sharding GCP IO tests from the javaPostCommit task (#21800)

[noreply] Bump cloud.google.com/go/storage from 1.22.1 to 1.23.0 in /sdks (#22038)

[noreply] Followup sharding javaPostCommit (#22081)

[noreply] remove mention of dill in release notes as it's not relevant. (#22087)

[noreply] [#21634] Add comments on FieldValueGetter. (#21982)

[noreply] Bump google.golang.org/api from 0.85.0 to 0.86.0 in /sdks (#22092)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 340b4217639753e7b16dedce29916491644a6c82 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 340b4217639753e7b16dedce29916491644a6c82 # timeout=10
Commit message: "Bump google.golang.org/api from 0.85.0 to 0.86.0 in /sdks (#22092)"
 > git rev-list --no-walk a3b3182e38fe6b2152f371d4232ddc5d22feed71 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5850094982069037567.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0629153504 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 4 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/laed2rjm4df6o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #749

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/749/display/redirect?page=changes>

Changes:

[Pablo Estrada] Blog post and updates for release 2.40.0

[noreply] 22011 remove checks on client.close() except when

[noreply] update flutter version to 3.0.1-stable (#22062)

[noreply] Add randomness to integration test job names to avoid collisions

[noreply] Give @pcoet triage permission (#22068)

[noreply] Issue#20877 Updated Interactive Beam README (#22034)

[noreply] Update issue bot to javascript and add label management (#22067)

[noreply] Clean up issue management doc page

[noreply] [BEAM-13015, #21250, fixes #22053] Improve PCollectionConsumerRegistry


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a3b3182e38fe6b2152f371d4232ddc5d22feed71 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a3b3182e38fe6b2152f371d4232ddc5d22feed71 # timeout=10
Commit message: "[BEAM-13015, #21250, fixes #22053] Improve PCollectionConsumerRegistry performance by swapping element count and sampled byte size to use a faster counter. (#22002)"
 > git rev-list --no-walk 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4563956605503559421.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0628150946 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ovngilfrl2xgk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
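
The failure above is Gradle's standard error for dereferencing a task property on a project that has not been configured (or that never registers the task): build.gradle line 49 apparently reads a 'shadowJar' property off the Dataflow runner project while ':sdks:python:apache_beam:testing:load_tests' is being evaluated. Below is a minimal sketch of that failing shape and of a deferred lookup that avoids it; the ':worker' project path and the task wiring are placeholders (the real project name is masked as '****' in this log), not the actual Beam build logic.

    // Minimal sketch with hypothetical project names -- not the actual contents
    // of sdks/python/apache_beam/testing/load_tests/build.gradle.

    // Failing shape: an eager property lookup during script evaluation.
    // If ':worker' has not applied the Shadow plugin yet (or the task was
    // renamed or removed), Gradle throws "Could not get unknown property
    // 'shadowJar' for project ':worker' of type org.gradle.api.Project".
    //
    //   def workerJar = project(':worker').shadowJar.archiveFile

    // Deferred shape: force ':worker' to be configured first, then resolve
    // the task lazily by name instead of as a dynamic property.
    evaluationDependsOn(':worker')
    tasks.register('copyWorkerJar', Copy) {
        // from(TaskProvider) also wires the task dependency on shadowJar.
        from project(':worker').tasks.named('shadowJar')
        into layout.buildDirectory.dir('worker-jar')
    }

Even the deferred form fails if the referenced project never registers a 'shadowJar' task at all, which would be consistent with the same error recurring on every build below regardless of the checked-out revision.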

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #748

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/748/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
Commit message: "Fix SpannerIO flakes (#22023)"
 > git rev-list --no-walk 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2168118793752211353.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0627151135 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/drmyc2zf3mrjm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #747

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/747/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
Commit message: "Fix SpannerIO flakes (#22023)"
 > git rev-list --no-walk 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2840747005718017721.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0626150939 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jyjbynp4oyo7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #746

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/746/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Use WindowedValue.withValue rather than WindowedValue.of in

[Robert Bradshaw] [BEAM-14464] More efficient grouping keys in precombiner table.

[Robert Bradshaw] fix compile after merge

[Robert Bradshaw] spotless

[Robert Bradshaw] Only flush every Nth element.

[Robert Bradshaw] spotless

[Robert Bradshaw] Post-merge fix.

[Robert Bradshaw] Fix test expectations.

[bulat.safiullin] [Website] add guard expressions to fix-menu and page-nav

[noreply] Unify to a single issue report (#22045)

[noreply] Remove colon in issue report

[noreply] Bump cloud.google.com/go/pubsub from 1.22.2 to 1.23.0 in /sdks (#22036)

[noreply] Fix vendored dependency issue and other style checks (#22046)

[noreply] Bump shell-quote (#21983)

[noreply] Revert "[BEAM-13590]Update Pytest version to support Python 3.10

[noreply] Bump cloud.google.com/go/bigquery from 1.32.0 to 1.34.1 in /sdks

[noreply] Bump github.com/spf13/cobra from 1.4.0 to 1.5.0 in /sdks (#21955)

[yathu] checkStlye Fix: remove redundant static and public in interface. camel

[noreply] Fix DEADLINE_EXCEEDED flakiness  (#22035)

[noreply] Fix SpannerIO flakes (#22023)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7ad4864b0cb19b6c8405265f84fff24bf5b2c8b3 # timeout=10
Commit message: "Fix SpannerIO flakes (#22023)"
 > git rev-list --no-walk 10dab960d9695266fbbbeb040a378550fb440be6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8356857408503159363.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0625150913 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rwukxvpxalkgk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #745

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/745/display/redirect?page=changes>

Changes:

[noreply] Canonicalize standard_coders.yaml booleans

[noreply] Followup fix FileIOTest.testMatchWatchForNewFiles flaky (#21877)

[noreply] Fix links for issue report (#22033)

[noreply] Merge pull request #21953 from Implement

[noreply] Enable close issue as not planned (#22032)

[noreply] Rename README.md to ACTIONS.md (#22043)

[noreply] Removes examples of unscalable sinks from documentation. (#22020)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 10dab960d9695266fbbbeb040a378550fb440be6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 10dab960d9695266fbbbeb040a378550fb440be6 # timeout=10
Commit message: "Removes examples of unscalable sinks from documentation. (#22020)"
 > git rev-list --no-walk dc0b5e40417ad6c63890fef89d770a0606ce7282 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8434804929758533913.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0624151040 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/scow6vy43xyxe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #744

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/744/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Streaming-related runner fixes.

[Robert Bradshaw] Improvements to auto-started services.

[Robert Bradshaw] Fix version, asserts for remote execution.

[Robert Bradshaw] Add IO dependencies.

[Robert Bradshaw] Add several cross-language IOs.

[Robert Bradshaw] Disable tests that require new release is required for out-of-the-box

[rszper] Correcting the regex for the Dataflow job name.

[noreply] Merge pull request #21981 from [Playground] Upgrade Flutter linter, fix

[andyye333] Move wrapper class outside run()

[noreply] Clean up redundant articles, prepositions, conjunctions appeared

[noreply] Fix FlatMap numpy array bug (#22006)

[Robert Bradshaw] More strongly typed outputs.

[noreply] Fix issues with test ordering (#21986)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision dc0b5e40417ad6c63890fef89d770a0606ce7282 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f dc0b5e40417ad6c63890fef89d770a0606ce7282 # timeout=10
Commit message: "Fix issues with test ordering (#21986)"
 > git rev-list --no-walk 242f8f3ffe4802bce130403690241fcab0bd7281 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2093238737971764951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0623150957 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5ygezskbqmfqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #743

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/743/display/redirect?page=changes>

Changes:

[yiru] fix: Add a retry code to insertall retry policy

[johnjcasey] 21742 add warning for risky kafka configuration

[johnjcasey] 21742 run spotless

[noreply] Fix target email for flaky test/p0/p1 reports

[noreply] Add unit testing for graphx/user.go (#21962)

[bulat.safiullin] [Website] add lyft to quote cards on homepage, use relative paths for

[noreply] Update documentations and document generation (#21965)

[noreply] Add ExecuteBundles transform to Go FhirIO (#21840)

[noreply] Bump cloud.google.com/go/datastore from 1.6.0 to 1.8.0 in /sdks (#21973)

[noreply] Bump google.golang.org/api from 0.83.0 to 0.85.0 in /sdks (#21974)

[noreply] [Go SDK] Adds a snippet for GBK in BPG (#21842)

[noreply] Update parameterized requirement in /sdks/python (#21975)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 242f8f3ffe4802bce130403690241fcab0bd7281 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 242f8f3ffe4802bce130403690241fcab0bd7281 # timeout=10
Commit message: "Update parameterized requirement in /sdks/python (#21975)"
 > git rev-list --no-walk 75cba1085a3f6f069d78096f3a3eb95076129525 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins560785206642379670.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0622150950 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3ma73lmbx227q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #742

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/742/display/redirect?page=changes>

Changes:

[noreply] Modified KafkaIO.Read SDF->Legacy forced override to fail if configured

[noreply] [BEAM-13590]Update Pytest version to support Python 3.10 (#17791)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 75cba1085a3f6f069d78096f3a3eb95076129525 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 75cba1085a3f6f069d78096f3a3eb95076129525 # timeout=10
Commit message: "[BEAM-13590]Update Pytest version to support Python 3.10 (#17791)"
 > git rev-list --no-walk 0ef5d3a185c1420da118208353ceb0b40b3a27c9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8362676324261311956.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0621152505 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 18s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zuld3cmplwnyu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #741

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/741/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Flink job


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0ef5d3a185c1420da118208353ceb0b40b3a27c9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0ef5d3a185c1420da118208353ceb0b40b3a27c9 # timeout=10
Commit message: "Merge pull request #21747: [BEAM-12918] Add PostCommit_Java_Tpcds_Flink job"
 > git rev-list --no-walk de5c56a5b8a8a030e7e67323a696d52495e37f7f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1545143291348946793.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0620150947 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3fbd4hezh3uzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #740

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/740/display/redirect?page=changes>

Changes:

[Pablo Estrada] Removing playground from main page to remove scrolling issue

[noreply] Merge pull request #21940 from [21941] Fix no output timestamp case


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision de5c56a5b8a8a030e7e67323a696d52495e37f7f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f de5c56a5b8a8a030e7e67323a696d52495e37f7f # timeout=10
Commit message: "Merge pull request #21940 from [21941] Fix no output timestamp case"
 > git rev-list --no-walk 525a169e6f807e301f1ac5e039645d4961da18d7 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8547657782720283792.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0619150848 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cpfqzuzkdzli4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #739

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/739/display/redirect?page=changes>

Changes:

[yathu] Unsickbay copy_rewrite_token tests

[Kenneth Knowles] Suppress unneeded spotbugs unused store warnings

[Kenneth Knowles] Eliminate nullness errors in KafkaIO

[yathu] Fix beam_PostCommit_Java_Sickbay build

[bulat.safiullin] [Website] add publishdate attribute to frontmatter

[noreply] Add guidance on self-assigning/closing to issue templates (#21931)

[noreply] Update names.py

[noreply] [Website] add new case-study, fix styles, add related images (#21891)

[noreply] Merge pull request #21928 from [Fixes #21927] Compress

[noreply] BigQueryIO: Adding the BASIC view setting to getTable request  (#21879)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 525a169e6f807e301f1ac5e039645d4961da18d7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 525a169e6f807e301f1ac5e039645d4961da18d7 # timeout=10
Commit message: "Merge pull request #21933 from Update container tags used by Dataflow runner with unreleased SDKs"
 > git rev-list --no-walk b5ea07d77c0a7200aaa6af51b3d48d5a4da7f817 # timeout=10
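
One oddity in the fetch above: this run was started by a timer rather than
a pull-request trigger, so the ghprbPullId build parameter is never set and
Jenkins passes the refspec through with the literal ${ghprbPullId}
placeholder. Git accepts it because the glob pattern simply matches no
refs. On a PR-triggered run the same refspec would expand to something like
the following (the PR number here is hypothetical):

  +refs/pull/12345/*:refs/remotes/origin/pr/12345/*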
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3483872363968078278.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0618150548 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
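
Two details of this invocation are easy to miss. First, the input_options
arithmetic matches the "2GB" in the job name: 2,097,152 records × 1,024-byte
values = 2^31 bytes = 2 GiB of payload (the 1-byte keys add a further 2 MiB).
Second, -Dorg.gradle.jvmargs is passed twice; since a later definition of
the same system property overrides an earlier one, the daemon most likely
sees only -Xmx6g while the -Xms2g setting is silently dropped. A hedged fix,
if both settings were intended, is a single combined property:

  -Dorg.gradle.jvmargs='-Xms2g -Xmx6g'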
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gmwc7elo27pno

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #738

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/738/display/redirect?page=changes>

Changes:

[yathu] [BEAM-3177][BEAM-5468] Add pipeline options to set default logging level

[noreply] Remove dataframe warnings from py38-docs logs (#21861)

[noreply] Update references to Jira to GH for the Java SDK (#21836)

[noreply] [21709] - Fix for "beam_PostCommit_Java_ValidatesRunner_Samza Failing"

[noreply] Update references to jira to GH for the Runners (#21835)

[noreply] Update remaining references to Jira to GH (#21834)

[ahmedabualsaud] test fixes

[ahmedabualsaud] no need for this line

[Kenneth Knowles] Re-activate nullness checking for some of sdks/java/core/coders

[noreply] Expand pr bot to python (#21791)

[noreply] Update run inference documentation (#21921)

[noreply] Consider skipped checks successful (#21924)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b5ea07d77c0a7200aaa6af51b3d48d5a4da7f817 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b5ea07d77c0a7200aaa6af51b3d48d5a4da7f817 # timeout=10
Commit message: "Consider skipped checks successful (#21924)"
 > git rev-list --no-walk bcdc5392c2175a48c9c4f75bf5d3b57a4d15ac85 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8174678518754373345.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0617150541 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zwwg44uhvkjue

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #737

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/737/display/redirect?page=changes>

Changes:

[naireenhussain] convert windmill min timestamp to beam min timestamp

[nielm] Add Spanner Integration tests to verify exception handling

[egalpin] Drops usage of setWindowingStrategyInternal in favour of direct use of

[noreply] Switch go todos from issue # syntax to links (#21890)

[Valentyn Tymofieiev] Rollback dill.

[noreply] Add Pytorch image segmentation example (#21766)

[noreply] Add README documentation for scikit-learn MNIST example (#21887)

[noreply] Decompose labels for new issues (#21888)

[noreply] Use Go 1.18 for go-licenses (#21896)

[egalpin] Gives unique names to ES IO Write windowing

[noreply] [BEAM-12903] Cron job to cleanup Dataproc leaked resources (#21779)

[noreply] [BEAM-7209][BEAM-9351][BEAM-9428] Upgrade Hive to version 3.1.3 (#17749)

[noreply] Sharding IO tests (Kafka, Debezium, JDBC, Kinesis, Neo4j) from the

[noreply] Merge pull request #17604 from [BEAM-14315] Match updated files

[noreply] Merge pull request #21781 from Sklearn Mnist example and IT test

[Pablo Estrada] Update Python base image requirements

[noreply] Get the latest version of go-licenses (#21901)

[noreply] Hide internal helpers added to DoFn for batched DoFns (#21860)

[noreply] Updated documentation for ml.inference docs. (#21868)

[Pablo Estrada] Moving to 2.41.0-SNAPSHOT on master branch.

[noreply] Add a type hint to nexmark query 3 joinFn (#21873)

[Kenneth Knowles] Revert "convert windmill min timestamp to beam min timestamp"

[noreply] Fix a few small config issues (#21909)

[dannymccormick] Update py to python label

[noreply] Daily p0/p1/flaky reports for issues (#21725)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision bcdc5392c2175a48c9c4f75bf5d3b57a4d15ac85 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bcdc5392c2175a48c9c4f75bf5d3b57a4d15ac85 # timeout=10
Commit message: "Daily p0/p1/flaky reports for issues (#21725)"
 > git rev-list --no-walk 9a74f17a4c11955eb54c0bc6aae4ba42c225fbea # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4619891630070967870.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0616153037 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 4s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/co5ggsl4fcimm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #736

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/736/display/redirect?page=changes>

Changes:

[nielm] Add transform names to help debug flaky test

[dannymccormick] Mark issues as triaged when they are assigned

[chamikaramj] Automatically enable Runner v2 for pipelines that use cross-language

[bulat.safiullin] [BEAM-13229] side nav bug fixed

[bulat.safiullin] fix links for pipelines

[noreply] Split PytorchModelHandler into PytorchModelHandlerTensor and

[noreply] Fix Hadoop Downloader Range not correct (#21778)

[noreply] [BEAM-14036] Read Configuration for Pub/Sub SchemaTransform (#17730)

[noreply] [Go SDK] Add more info to Worker Status API (#21776)

[noreply] Make PeriodicImpulse generates unbounded PCollection (#21815)

[noreply] [BEAM-14267] Update watchForNewFiles to allow watching updated files

[noreply] fix timestamp conversion in Google Cloud Datastore Connector (#17789)

[noreply] Update references to Jira to GH for the Go label (#21830)

[noreply] [#21853] Adjust Go cross-compile to target entire package (#21854)

[Kenneth Knowles] Adjust Jenkins configuration to allow more memory per JVM

[noreply] [BEAM-14553] Add destination coder to FileResultCoder components

[noreply] copyedited README for RunInference examples (#21855)

[noreply] Document and test overriding batch type inference (#21844)

[noreply] Update references to Jira to GH for the Python SDK (#21831)

[noreply] add highlights to changes (#21865)

[noreply] Merge pull request #21793: [21794 ] Fix output timestamp in Dataflow.

[noreply] Adding more info to the sdk_worker_parallelism description (#21839)

[noreply] Add Bert Language Modeling example (#21818)

[noreply] [BEAM-14524] Returning NamedTuple from RunInference transform (#17773)

[noreply] Unit tests for RunInference keyed/unkeyed Modelhandler and examples

[noreply] Remove kwargs and add explicit runinference_args (#21806)

[noreply] Modify README for 3 pytorch examples (#21871)

[noreply] Sickbay Pytorch example IT test (#21857)

[noreply] Add required=True to Pytorch image classification example (#21883)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9a74f17a4c11955eb54c0bc6aae4ba42c225fbea (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9a74f17a4c11955eb54c0bc6aae4ba42c225fbea # timeout=10
Commit message: "Add required=True to Pytorch image classification example (#21883)"
 > git rev-list --no-walk 12ba4cea9d6a76a522106e6bb55f46fed091669f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5701935074059228796.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0615152621 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 18s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3hibnz2j6da22

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #735

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/735/display/redirect?page=changes>

Changes:

[noreply] Bump cloud.google.com/go/pubsub from 1.21.1 to 1.22.2 in /sdks

[dannymccormick] Stop collecting jira metrics

[dannymccormick] Move to contains notation

[dannymccormick] fix query to get all updated issues

[noreply] Add RunInference API to CHANGES.md (#21754)

[Kenneth Knowles] Do not allow postcommit jobs phrase triggering

[noreply] Refactor API code to base.py in RunInference (#21801)

[noreply] Provide a diagnostic error message when a filesystem scheme is not

[Kiley Sok] Disable more triggers

[noreply] [BEAM-14532] Add integration testing to fhirio Read transform (#17803)

[noreply] Merge pull request #17794 from [#21252] Enforce pubsub message

[noreply] Separated pandas and numpy implementations of sklearn. (#21803)

[noreply] Composite triggers and unit tests for Go SDK (#21756)

[Kiley Sok] Enable phrase trigger for a few post commits

[Kiley Sok] spotless

[noreply] [BEAM-14557] Read and Seek Runner Capabilities in Go SDK  (#17821)

[noreply] [BEAM-13806] Add x-lang BigQuery IO integration test to Go SDK. (#16818)

[Jan Lukavský] [BEAM-14265] Add watermark hold for all timers

[noreply] Bump Python beam-master container (#21820)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 12ba4cea9d6a76a522106e6bb55f46fed091669f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 12ba4cea9d6a76a522106e6bb55f46fed091669f # timeout=10
Commit message: "Bump Python beam-master container (#21820)"
 > git rev-list --no-walk 63cd54e2e2b18d6d673adeae72fe4f60a3d8732f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6825437849351840175.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0614152215 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 54s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7goqvhjtiohh6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #734

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/734/display/redirect?page=changes>

Changes:

[noreply] Refactor code according to keyedModelHandler changes (#21819)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 63cd54e2e2b18d6d673adeae72fe4f60a3d8732f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 63cd54e2e2b18d6d673adeae72fe4f60a3d8732f # timeout=10
Commit message: "Refactor code according to keyedModelHandler changes (#21819)"
 > git rev-list --no-walk b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7121477853505068562.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/luxfibufsyzro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #733

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/733/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
Commit message: "Make keying of examples explicit. (#21777)"
 > git rev-list --no-walk b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2140576185273409715.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/423h3leped4qe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #732

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/732/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14535] Added support for pandas in sklearn inference runner

[noreply] Merge ModelLoader and InferenceRunner into same class. (#21795)

[noreply] Merge pull request #17589 from [BEAM-14422] Exception testing for

[noreply] Add README for image classification example (#21758)

[anandinguva98] fixup: bug

[noreply] Fix every PR linking to PR 123 (#21802)

[noreply] Add native PubSub IO prototype to Go (#17955)

[noreply] Allow creation of dynamically defined transforms in the Python expansion

[noreply] Make keying of examples explicit. (#21777)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b8e2e85ab1fb37a2f89ed20d88730e591ea3bf7e # timeout=10
Commit message: "Make keying of examples explicit. (#21777)"
 > git rev-list --no-walk 0de98210f4531fbfd88265bc02052b27bd299602 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1812392021055503613.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5atanmbnsj2gw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #731

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/731/display/redirect?page=changes>

Changes:

[dannymccormick] Update dashboards to use gh data instead of jira data

[noreply] Merge pull request #21746: Exclude GCP Java packages from Dependabot

[noreply] Update .test-infra/metrics/grafana/dashboards/source_data_freshness.json

[noreply] Better cross langauge support for dataframe reads. (#21762)

[noreply] Add template_location flag to Go Dataflow runner (#21774)

[noreply] [BEAM-14406] Drain test for SDF truncation in Go SDK (#17814)

[noreply] More Jira -> Issues doc updates (#21770)

[noreply] [BEAM-11104] Add code snippet for Go SDK Self-Checkpointing (#17956)

[noreply] [BEAM-13769]Add no_xdist marker for cloudpickle test (#17538)

[noreply] [BEAM-14533] Bump cloudpickle to 2.1.0 (#17780)

[noreply] Add basic byte size estimation for batches (#17771)

[noreply] Add @yields_batches and @yields_elements (#19268)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0de98210f4531fbfd88265bc02052b27bd299602 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0de98210f4531fbfd88265bc02052b27bd299602 # timeout=10
Commit message: "Add @yields_batches and @yields_elements (#19268)"
 > git rev-list --no-walk 67533d17fd70c0c8994a3eb758b175dddfaea83b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2098260884467871742.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0610150747 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 16s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j7bmvhjhjmcz4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #730

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/730/display/redirect?page=changes>

Changes:

[nishantjain] [BEAM-14000] Elastic search IO doesnot work when both username/password

[nishantjain] Fixes issue with httpclientbuilder - Use the existing builder instead of

[nishantjain] moves sslcontext towards starting of function

[nishantjain] adds unit test

[nishantjain] changes unit test to directly built restclient

[nishantjain] changes name of unit test

[nishantjain] adds test to all elasticsearch folder

[nishantjain] updates changes.md

[nishantjain] spotless fix

[dannymccormick] Gather metrics on GH Issues

[dannymccormick] Fixes

[dannymccormick] Fixes

[dannymccormick] Comment + naming fix

[dannymccormick] Conflicts fix

[dannymccormick] Ordering

[dannymccormick] Different fallback for prs/issues

[noreply] Add ability to self-assign issues for non-committers (#21719)

[dannymccormick] Fix sync time

[noreply] Dont try to generate jiras as part of dependency report (#21753)

[noreply] Allow users to comment `.take-issue` without taking (#21755)

[noreply] Merge pull request: [Beam-14528]: Add ISO time format support for

[noreply] Update all links to in progress jiras to issues (#21749)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 67533d17fd70c0c8994a3eb758b175dddfaea83b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 67533d17fd70c0c8994a3eb758b175dddfaea83b # timeout=10
Commit message: "Merge pull request #17297 from nishantjain91/elasticsearch_fix"
 > git rev-list --no-walk a1c3d0cd60d686196e8643ebf3eef9816a24b66a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7748109413361912237.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0609150610 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tzucxp7z24z7i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #729

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/729/display/redirect?page=changes>

Changes:

[nielm] Fix SpannerIO service call metrics and improve tests.

[andyye333] Add Pytorch support for batched keyed examples

[andyye333] Add general support for non-batchable kwargs params; Add

[noreply] [BEAM-12554] Create new instances of FileSink in sink_fn (#17708)

[noreply] DataflowRunner: Experiment added to disable unbounded PCcollection

[vachan] Fix for increased FAILED_PRECONDITION errors in BQ Read API.

[noreply] More flexible Python Callable type. (#17767)

[noreply] Fix typos in README (#17675)

[vachan] Adding comments.

[noreply] Bump google.golang.org/api from 0.81.0 to 0.83.0 in /sdks (#21743)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a1c3d0cd60d686196e8643ebf3eef9816a24b66a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a1c3d0cd60d686196e8643ebf3eef9816a24b66a # timeout=10
Commit message: "Bump google.golang.org/api from 0.81.0 to 0.83.0 in /sdks (#21743)"
 > git rev-list --no-walk e62ae391985fc13c7df1ee6e088525835ceaa560 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8236395090098188378.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0608150551 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3hbaqfspkgdt2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #728

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/728/display/redirect?page=changes>

Changes:

[yathu] [BEAM-14471] Fix PytestUnknownMarkingWarning

[Robert Bradshaw] Populate missing display data for remotely expanded transforms.

[Robert Bradshaw] Add an option to run Python operations in-line when invoked as a remote

[Robert Bradshaw] Pass options underlying runner in remote job service.

[noreply] Update Jira -> Issues in the Readme

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Dataflow job

[noreply] Clean up uses of == instead of === in ts sdk (#17732)

[Robert Bradshaw] Comment, lint fixes.

[noreply] Mount GCP credentials in local docker environments. (#19265)

[noreply] [BEAM-14068]Add Pytorch inference IT test and example (#17462)

[noreply] [Playground] [Hotfix] Remove autoscrolling from embedded editor (#21717)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e62ae391985fc13c7df1ee6e088525835ceaa560 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e62ae391985fc13c7df1ee6e088525835ceaa560 # timeout=10
Commit message: "Merge pull request #17680: [BEAM-12918] Add PostCommit_Java_Tpcds_Dataflow job"
 > git rev-list --no-walk 9a7c9ce9b84c0e17db0647d1652ad01e0d527eee # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8980759139499028493.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0607150601 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lfa7trpstnzww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #727

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/727/display/redirect?page=changes>

Changes:

[noreply] [Fixes #18679] Ensure that usage of metrics on a template job reports an


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9a7c9ce9b84c0e17db0647d1652ad01e0d527eee (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9a7c9ce9b84c0e17db0647d1652ad01e0d527eee # timeout=10
Commit message: "[Fixes #18679] Ensure that usage of metrics on a template job reports an error (#18905)"
 > git rev-list --no-walk 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2505222853943532384.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0606150542 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/m3bg54tdq2rli

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #726

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/726/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4dce7b8857f37608321253073745fe7611a48af9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
Commit message: "[BEAM-14556] Honor the formatter installed on the root handler. (#17820)"
 > git rev-list --no-walk 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2312229268456118203.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0605150544 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/f5ty5ppiq7lxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #725

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/725/display/redirect?page=changes>

Changes:

[Pablo Estrada] Revert "Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ

[noreply] Alias worker_harness_container_image to sdk_container_image (#17817)

[noreply] [BEAM-14546] Fix errant pass for empty collections in Count (#17813)

[noreply] Merge pull request #17741 from [BEAM-14504] Add support for including

[noreply] Merge pull request #18374 from [BEAM-13945] Roll forward JSON support

[noreply] Merge pull request #17792 from [BEAM-13756] [Playground] Merge Log and

[noreply] Merge pull request #17779: [BEAM-14529] Add integer to float64

[noreply] [BEAM-14556] Honor the formatter installed on the root handler. (#17820)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4dce7b8857f37608321253073745fe7611a48af9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4dce7b8857f37608321253073745fe7611a48af9 # timeout=10
Commit message: "[BEAM-14556] Honor the formatter installed on the root handler. (#17820)"
 > git rev-list --no-walk 8e105977f963defeb9bbac5a94275cb356069c5a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6795697062723060415.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0604150559 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rjce5xw572n32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #724

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/724/display/redirect?page=changes>

Changes:

[dannymccormick] [BEAM-14446] Update some docs to point to GitHub issues

[dannymccormick] More doc updates

[dannymccormick] Update issueManagement fields

[dannymccormick] Fix website build

[dannymccormick] Remove extraneous comment line

[noreply] Commit message guidance

[noreply] [BEAM-10976] Fix bug with bundle finalization on SDFs (and a small doc

[noreply] Bump google.golang.org/grpc from 1.46.2 to 1.47.0 in /sdks (#17806)

[noreply] Rename pytorch files (#17798)

[noreply] Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ

[noreply] [BEAM-11105] Add more watermark estimation docs for go (#17785)

[noreply] [BEAM-11106] documentation for SDF truncation in Go (#17781)

[noreply] [BEAM-11167] Updates dill package to version 0.3.5.1 (#17669)

[noreply] [BEAM-6258] Use gRPC 1.33.1 as min version to ensure that we pickup

[noreply] [BEAM-14441] Enable GitHub issues (#17812)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8e105977f963defeb9bbac5a94275cb356069c5a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8e105977f963defeb9bbac5a94275cb356069c5a # timeout=10
Commit message: "[BEAM-14441] Enable GitHub issues (#17812)"
 > git rev-list --no-walk 999bceab8e87d25f30faffe7d6431e2d8588663f # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8484491092093905707.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0603150543 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pcqpbzhuywoyg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #723

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/723/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Fix parsing of -PenableCheckerFramework in build

[Kenneth Knowles] Fix additional nullness errors in BigQueryIO

[yathu] [BEAM-13984] followup Fix precommit

[noreply] [BEAM-14513] Add read transform and initial healthcare client (#17748)

[noreply] [BEAM-14536] Handle 0.0 splits in offsetrange restriction (#17782)

[noreply] [BEAM-14470] Use lifecycle method names directly. (#17790)

[noreply] [BEAM-14297] add nullable annotations and an integration test (#17742)

[noreply] Only generate Javadocs for latest Spark runner version (Spark 3) to fix

[noreply] Fail Javadoc aggregateJavadoc task if there's an error (#17801)

[noreply] Merge pull request #17753 from [BEAM-14510] adding exception tests to

[noreply] feat: allow for unknown values in change streams (#17655)

[noreply] Support JdbcIO autosharding in Python (#16921)

[noreply] [BEAM-14511] Growable Tracker for Go SDK (#17754)

[noreply] [BEAM-14539] Ensure that the print stream can handle larger byte arrays


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 999bceab8e87d25f30faffe7d6431e2d8588663f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 999bceab8e87d25f30faffe7d6431e2d8588663f # timeout=10
Commit message: "[BEAM-14539] Ensure that the print stream can handle larger byte arrays being written and also allow for a growable amount of carry over. (#17787)"
 > git rev-list --no-walk ca33943808c56ce634c92eb85f865285c71ee048 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5595383732220588925.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/65zspmrmx3mtu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #722

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/722/display/redirect?page=changes>

Changes:

[chamikaramj] Adds Java cross-language transforms for invoking Python Map and FlatMap

[noreply] Merge pull request #17683 from [BEAM-14475] add test cases to GcsUtil

[noreply] [BEAM-14410] Add test to demonstrate BEAM-14410 issue in non-cython

[noreply] [BEAM-14449] Support cluster provisioning when using Flink on Dataproc

[noreply] [BEAM-14527] Implement "Beam Summit 2022" banner (#17776)

[noreply] Merge pull request #17222 from [BEAM-12164] Feat: Add new restriction

[noreply] Merge pull request #17598 from [BEAM-14451] Support export to BigQuery

[noreply] Add typing information to RunInferrence. (#17762)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ca33943808c56ce634c92eb85f865285c71ee048 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ca33943808c56ce634c92eb85f865285c71ee048 # timeout=10
Commit message: "Add typing information to RunInferrence. (#17762)"
 > git rev-list --no-walk 31114e893cea46834a7f92451c1c1c2633c8fa40 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4610402335849140739.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tuz4otv2ifwzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #721

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/721/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14255] Drop clock abstraction (#17671)

[noreply] Adds __repr__ to NullableCoder (#17757)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 31114e893cea46834a7f92451c1c1c2633c8fa40 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 31114e893cea46834a7f92451c1c1c2633c8fa40 # timeout=10
Commit message: "Adds __repr__ to NullableCoder (#17757)"
 > git rev-list --no-walk 9a6f7699b5d8daf846221d522d3702c5a4c7b562 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins634117498956676446.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jrmgwnjci3t7w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/720/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14170] - Create a test that runs sickbayed tests (#17471)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9a6f7699b5d8daf846221d522d3702c5a4c7b562 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9a6f7699b5d8daf846221d522d3702c5a4c7b562 # timeout=10
Commit message: "[BEAM-14170] - Create a test that runs sickbayed tests (#17471)"
 > git rev-list --no-walk 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3062760850238497700.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/exknzpdccimtc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/719/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0fb68863779bb6cf082cd91331159e5743bb17d6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
Commit message: "cleaned up TypeScript in coders.ts (#17689)"
 > git rev-list --no-walk 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6255318813257170836.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zpber2gkf5tx6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/718/display/redirect?page=changes>

Changes:

[ilion.beyst] minor: don't capture stderr in kata tests

[Kiley Sok] Update beam-master version for legacy

[Heejong Lee] Fix NoneType error when importing google.api_core fails

[noreply] [BEAM-13972] Update documentation for run inference (#17508)

[noreply] [BEAM-14502] Fix: Splitting scans into smaller chunks to buffer reads

[noreply] [BEAM-14218] Add resource location hints to base inference runner.

[noreply] [BEAM-14442] Ask for repro steps/redirect to user list in bug template

[noreply] [BEAM-14166] Push logic in RowWithGetters down into getters and use

[noreply] cleaned up TypeScript in coders.ts (#17689)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0fb68863779bb6cf082cd91331159e5743bb17d6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fb68863779bb6cf082cd91331159e5743bb17d6 # timeout=10
Commit message: "cleaned up TypeScript in coders.ts (#17689)"
 > git rev-list --no-walk 57f37052067cc690d1515af0cddc604b9c325e11 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4455817510400533953.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wcaav6nfhms5q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/717/display/redirect?page=changes>

Changes:

[thiagotnunes] BEAM-14419: Remove invalid mod type

[ihr] [BEAM-14006] Update Python katas to 2.38 and fix issue with one test

[Heejong Lee] [BEAM-14478] Fix missing 'projectId' attribute error

[relax] DLQ for BQ Storage Api writes

[noreply] Bump google.golang.org/api from 0.76.0 to 0.81.0 in /sdks

[noreply] [BEAM-14336] Re-enable `flight_delays_it_test` with

[noreply] [BEAM-11106] small nits to truncate sdf exec unit (#17755)

[noreply] Added standard logging when exception is thrown (#17717)

[noreply] [BEAM-13829] Enable worker status in Go

[noreply] [BEAM-14519] Add website page for Go dependencies (#17766)

[noreply] [BEAM-11106] Validate that DoFn returns Process continuation when

[noreply] [BEAM-14505] Add Dataflow streaming pipeline update support to the Go


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 57f37052067cc690d1515af0cddc604b9c325e11 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 57f37052067cc690d1515af0cddc604b9c325e11 # timeout=10
Commit message: "Merge pull request #17634 from iht/update_python_katas"
 > git rev-list --no-walk c5e521a85f93527b6b3fe20aea505206316ce7ce # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5947605995735129723.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yqmdqgc2xido2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/716/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-14426] Allow skipping of any output when writing an empty

[Robert Bradshaw] Add skip_if_empty attribute to base class to fix test.

[Jan Lukavský] [BEAM-14492] add flinkConfDir to FlinkPipelineOptions

[noreply] Bump cloud.google.com/go/storage from 1.22.0 to 1.22.1 in /sdks

[noreply] [BEAM-14139] Remove unused Flink 1.11 directory (#17750)

[noreply] [BEAM-14044] Allow ModelLoader to forward BatchElements args (#17527)

[noreply] [BEAM-14481] Remove unnecessary context (#17737)

[noreply] [BEAM-9324] Fix incompatibility of direct runner with cython (#17728)

[noreply] [BEAM-14503] Add support for Flink 1.15 (#17739)

[noreply] Update Beam website to release 2.39.0 (#17690)

[noreply] [BEAM-14509] Add several flags to dataflow runner (#17752)

[Yichi Zhang] Fix 2.38.0 download page.

[noreply] [BEAM-14494] Fix publish_docker_images.sh (#17756)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c5e521a85f93527b6b3fe20aea505206316ce7ce (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c5e521a85f93527b6b3fe20aea505206316ce7ce # timeout=10
Commit message: "Merge pull request #17715: [BEAM-14492] add flinkConfDir to FlinkPipelineOptions"
 > git rev-list --no-walk 3e683606d9a03e7da3d37a83eb16c3a6b96068cd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7690669058136473951.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dtzk5rphffjk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/715/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14471] Adding testcases and examples for xlang Python

[Heejong Lee] update

[Heejong Lee] add DataframeTransform wrapper

[noreply] [BEAM-14298] resolve dependency

[noreply] Fix -- linting issue (#17738)

[noreply] Fix 'NoneType' object has no attribute error

[noreply] [BEAM-12308] change expected value in kafka IT (#17740)

[noreply] [BEAM-14053] [CdapIO] Add wrapper class for CDAP plugin (#17150)

[noreply] [BEAM-14129] Clean up PubsubLiteIO by removing options that no longer

[noreply] [BEAM-14496] Ensure that precombine is inheriting one of the timestamps


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3e683606d9a03e7da3d37a83eb16c3a6b96068cd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3e683606d9a03e7da3d37a83eb16c3a6b96068cd # timeout=10
Commit message: "[BEAM-14496] Ensure that precombine is inheriting one of the timestamps output values (#17729)"
 > git rev-list --no-walk acea4027b6dd6726d838eaf50dfb5e1605bdf266 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5882662619506502803.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hoewgbm6atxt2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/714/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14494] Tag rc docker container with format ${RELEASE}rc${RC_NUM}

[noreply] [BEAM-11578] Fix TypeError in dataflow_metrics has 0 distribution sum

[noreply] [BEAM-14499] Step global, unbounded side input case back to warning

[noreply] [BEAM-14484] Step back unexpected primary handling to warnings (#17724)

[noreply] [BEAM-14486] Document pubsubio & fix its behavior. (#17709)

[noreply] [BEAM-14489] Remove non-SDF version of TextIO. (#17712)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision acea4027b6dd6726d838eaf50dfb5e1605bdf266 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f acea4027b6dd6726d838eaf50dfb5e1605bdf266 # timeout=10
Commit message: "[BEAM-14489] Remove non-SDF version of TextIO. (#17712)"
 > git rev-list --no-walk 1dfab628d03e161cf003dad01f55b9d6674aa8e2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8654532940689401765.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h2y7yiqeaxjn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/713/display/redirect?page=changes>

Changes:

[noreply] Add clarification on Filter transform's input function to pydoc.

[noreply] [BEAM-14367]Flaky timeout in


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1dfab628d03e161cf003dad01f55b9d6674aa8e2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1dfab628d03e161cf003dad01f55b9d6674aa8e2 # timeout=10
Commit message: "[BEAM-14367]Flaky timeout in StatefulDoFnOnDirectRunnerTest.test_dynamic_timer_clear_then_set_timer (#17569)"
 > git rev-list --no-walk 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9125435239829279642.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yxrqhv3w6y2m6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/712/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0c9cf43a7edae2e2a2622a8f4241b64a638121bb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
Commit message: "[BEAM-13015] Only create a TimerBundleTracker if there are timers. (#17445)"
 > git rev-list --no-walk 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6438190557656233332.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bezi24isaxpfe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
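
Every build in this stretch fails at the same point: line 49 of the load-tests build.gradle reads a 'shadowJar' property off the ':runners:google-cloud-dataflow-java:worker' project while Gradle is still evaluating the build, and no such property is resolvable there at that moment. Below is a minimal Groovy sketch of the failing pattern and a lazier alternative; it is a hypothetical illustration of this error class, not the actual Beam build file:

    // Hypothetical consumer script, not the real load_tests/build.gradle.
    // Eagerly dereferencing another project's task as a dynamic property throws
    // "Could not get unknown property 'shadowJar'" when that project has not
    // been evaluated yet, or no longer registers a shadowJar task at all.
    def workerProject = project(':runners:google-cloud-dataflow-java:worker')
    def workerJar = workerProject.shadowJar.archivePath   // eager lookup: throws

    // If evaluation order is the culprit, configuring the worker project first
    // is enough:
    evaluationDependsOn(':runners:google-cloud-dataflow-java:worker')
    // A TaskProvider defers realizing the task until it is actually needed:
    def workerJarProvider = workerProject.tasks.named('shadowJar')

Whether ordering or a removed task is at fault cannot be read off this log alone; rerunning with --stacktrace, as the report itself suggests, would show which.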

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/711/display/redirect?page=changes>

Changes:

[yathu] Add labels for typescript PRs

[noreply] Bump google.golang.org/grpc from 1.45.0 to 1.46.2 in /sdks (#17677)

[noreply] [BEAM-13015] Only create a TimerBundleTracker if there are timers.


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0c9cf43a7edae2e2a2622a8f4241b64a638121bb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0c9cf43a7edae2e2a2622a8f4241b64a638121bb # timeout=10
Commit message: "[BEAM-13015] Only create a TimerBundleTracker if there are timers. (#17445)"
 > git rev-list --no-walk 301acc825a808ae1d62f5115601a7d81b2514e7d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2507591418752031365.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xpmfqm2o4qqlc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/710/display/redirect?page=changes>

Changes:

[chamikaramj] Corrects I/O connectors availability status in Beam Website.

[singh.vikash2310] fixed typos in README.md

[noreply] Update the PTransform and associated APIs to be less class-based.

[noreply] Vortex performance improvement: Enable multiple stream clients per

[noreply] [BEAM-14488] Alias async flags. (#17711)

[noreply] [BEAM-14487] Make drain & update terminal states. (#17710)

[noreply] [BEAM-14484] Improve behavior surrounding primary roots in

[noreply] Improve validation error message (#17719)

[noreply] Remove unused validation configurations. (#17705)

[bulat.safiullin] [BEAM-14418] added arrows to slider

[noreply] Minor: Bump Dataflow container versions (#17684)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 301acc825a808ae1d62f5115601a7d81b2514e7d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 301acc825a808ae1d62f5115601a7d81b2514e7d # timeout=10
Commit message: "Merge pull request #17722: [BEAM-14418] added arrows to slider"
 > git rev-list --no-walk 212d63d291b0c4cbc685c320ea5b8768b9234b64 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins893183476287720211.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hepz45lutjlci

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/709/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-14428] change text, change styling of connectors and contribute

[noreply] [BEAM-10529] update KafkaIO Xlang integration test to publish and

[noreply] Fix a few small linting bugs (#17695)

[noreply] Bump github.com/lib/pq from 1.10.5 to 1.10.6 in /sdks (#17691)

[noreply] Update release-guide.md


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 212d63d291b0c4cbc685c320ea5b8768b9234b64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 212d63d291b0c4cbc685c320ea5b8768b9234b64 # timeout=10
Commit message: "Merge pull request #17572: [BEAM-14428] I/O, community, and contribute pages improvements"
 > git rev-list --no-walk 857f8d300d942177ebc4244b9b405222d7deb26d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9155362939050882918.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518185911 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 19s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gngmgv3yi4gb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/708/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12000] Update programming-guide.md (#17679)

[noreply] [BEAM-14467] Fix bug where run_pytest.sh does not elevate errors raised

[noreply] [BEAM-14474] Suppress 'Mean of empty slice' Runtime Warning in dataframe


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 857f8d300d942177ebc4244b9b405222d7deb26d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 857f8d300d942177ebc4244b9b405222d7deb26d # timeout=10
Commit message: "[BEAM-14474] Suppress 'Mean of empty slice' Runtime Warning in dataframe unit test (#17682)"
 > git rev-list --no-walk a37d324791b5e67d1b78c7e9cc0aaa5653b42826 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4468541818999584081.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0518150711 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ih3ixp7aoy3iw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/707/display/redirect?page=changes>

Changes:

[mmack] [BEAM-14334] Remove remaining forkEvery 1 from all Spark tests and stop

[noreply] [BEAM-14473] Throw error if using globally windowed, unbounded side

[noreply] [BEAM-14440] Add basic fuzz tests to the coders package (#17587)

[noreply] [BEAM-14035 ] Implement BigQuerySchema Read/Write TransformProvider

[noreply] Add Akvelon to case-studies (#17611)

[noreply] Merge pull request #17520 from BEAM-12356 Close DatasetService leaked

[noreply] Adding eslint and lint configuration to TypeScript SDK (#17676)

[noreply] Update release-guide.md

[noreply] Update release-guide.md

[noreply] [BEAM-14411] Re-enable TypecodersTest, fix most issues (#17547)

[noreply] Merge pull request #17678 from [BEAM-14460] [Playground] WIP. Fix error

[Alexey Romanenko] [BEAM-14035] Fix checkstyle issue

[noreply] [BEAM-14441] Automatically assign issue labels based on responses to

[noreply] README update for the Docker Error 255 during Website launch on Apple


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a37d324791b5e67d1b78c7e9cc0aaa5653b42826 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a37d324791b5e67d1b78c7e9cc0aaa5653b42826 # timeout=10
Commit message: "README update for the Docker Error 255 during Website launch on Apple Silicon (#17456)"
 > git rev-list --no-walk e6aab063e09ba52703e0417221de4c4466f8fd13 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins127776825587521914.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0517153729 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qav6w4pspp6lw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/706/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Update the SDK harness grouping table to be memory bounded

[noreply] [BEAM-13982] Added output of logging for python E2E pytests (#17637)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e6aab063e09ba52703e0417221de4c4466f8fd13 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e6aab063e09ba52703e0417221de4c4466f8fd13 # timeout=10
Commit message: "[BEAM-13982] Added output of logging for python E2E pytests (#17637)"
 > git rev-list --no-walk 5064cc247ba3ec2697cd7493b14cef8567d614f6 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5221359538103121011.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0516150708 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/f2jk6gpsu3vae

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/705/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14470] Use Generic Registrations in loadtests. (#17673)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5064cc247ba3ec2697cd7493b14cef8567d614f6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5064cc247ba3ec2697cd7493b14cef8567d614f6 # timeout=10
Commit message: "[BEAM-14470] Use Generic Registrations in loadtests. (#17673)"
 > git rev-list --no-walk 780ad62d42f8216ba030e97c203fc2310cd041b0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2403426245663029742.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0515150637 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zank6nzhhask6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/704/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14455] Add UUID to sub-schemas for PythonExternalTransform

[Heejong Lee] [BEAM-14430] Adding a logical type support for Python callables to Row

[Heejong Lee] add urn, type inference for PythonCallableSource

[Heejong Lee] fix lint errors

[Heejong Lee] move logical types def

[Heejong Lee] add micros_instant urn

[Heejong Lee] put a default type hint for PythonCallableSource

[Heejong Lee] add comment

[noreply] Revert "Better test assertion. (#17551)"

[noreply] Bump github.com/spf13/cobra from 1.3.0 to 1.4.0 in /sdks (#17647)

[noreply] [BEAM-14465] Reduce DefaultS3ClientBuilderFactory logging to debug level

[noreply] Merge pull request #17365 from [BEAM-12482] Update Schema Destination

[noreply] [BEAM-14014] Support impersonation credentials in dataflow runner

[noreply] [BEAM-14469] Allow nil primary returns from TrySplit in  a single-window

[noreply] Add some auto-starting runners to the typescript SDK. (#17580)

[noreply] [BEAM-14371] (and BEAM-14372) - enable a couple staticchecks (#17670)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 780ad62d42f8216ba030e97c203fc2310cd041b0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 780ad62d42f8216ba030e97c203fc2310cd041b0 # timeout=10
Commit message: "[BEAM-14371] (and BEAM-14372) - enable a couple staticchecks (#17670)"
 > git rev-list --no-walk 787479f1a5e178333ded3ff02331163c4fe75f1a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2338174919279329659.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0514150554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4y2otid5yulri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #703

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/703/display/redirect?page=changes>

Changes:

[dannymccormick] [BEAM-14441] Add GitHub issue templates

[dannymccormick] Ask for beam version + other dependencies

[dannymccormick] We don't need outage

[dannymccormick] Cut p4

[chamikaramj] Updates CHANGES.md to include some recently discovered known issues

[dannymccormick] Pare down to fewer templates

[noreply] Revert "[BEAM-14429] Force java load test on dataflow runner v2

[noreply] [BEAM-14347] Add generic registration feature to CHANGES (#17643)

[noreply] Better test assertion. (#17551)

[noreply] Bump github.com/google/go-cmp from 0.5.7 to 0.5.8 in /sdks (#17628)

[noreply] Bump github.com/testcontainers/testcontainers-go in /sdks (#17627)

[noreply] Bump github.com/lib/pq from 1.10.4 to 1.10.5 in /sdks (#17626)

[noreply] Merge pull request #17584 from [BEAM-14415] Exception handling tests and

[noreply] Bump cloud.google.com/go/pubsub from 1.18.0 to 1.21.1 in /sdks (#17646)

[noreply] Merge pull request #17408 from [BEAM-14312] [Website] change section

[noreply] Bump cloud.google.com/go/bigquery from 1.28.0 to 1.32.0 in /sdks

[noreply] [BEAM-14347] Add function for simple function registration (#17650)

[noreply] Drop dataclasses requirement, we only support python 3.7+ (#17640)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 787479f1a5e178333ded3ff02331163c4fe75f1a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 787479f1a5e178333ded3ff02331163c4fe75f1a # timeout=10
Commit message: "Drop dataclasses requirement, we only support python 3.7+ (#17640)"
 > git rev-list --no-walk fd61a90057011270dbf9a36c73b5baaf120100e2 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8937874220063793751.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0513150610 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy and 1 stopped Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 13s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q65m3ko6p6ble

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
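
Every build in this thread aborts at the same point, so one illustration suffices. What follows is a hypothetical Groovy sketch, not Beam's actual load_tests/build.gradle; it only borrows the project path and task name from the error message above to show the kind of eager property dereference that raises "Could not get unknown property", plus a deferred lookup that tolerates the task's absence. The task names loadTestRun and loadTestRunGuarded are invented for the example.

    // Hypothetical sketch of the failing pattern, assuming the build script
    // dereferences another project's task as a Groovy property.
    def workerProject = project(':runners:google-cloud-dataflow-java:worker')

    task loadTestRun {
        // Groovy resolves 'shadowJar' while the script is evaluated; if
        // ':worker' has not registered such a task, evaluation aborts with
        // "Could not get unknown property 'shadowJar' for project ...".
        dependsOn workerProject.shadowJar
    }

    task loadTestRunGuarded {
        // tasks.matching {} returns a live collection, so the lookup is
        // deferred and an absent 'shadowJar' contributes no dependency.
        dependsOn workerProject.tasks.matching { it.name == 'shadowJar' }
    }

With the guarded form the load test would simply run without that dependency instead of failing the whole build; whether that is acceptable for this suite is a separate question from what the actual fix in the Beam repository was.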

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/702/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-14096] bump junit-quickcheck to 1.0

[noreply] [BEAM-11104] Add self-checkpointing to CHANGES.md (#17612)

[noreply] [BEAM-14081] [CdapIO] Add context classes for CDAP plugins (#17104)

[noreply] [BEAM-12526] Add Dependabot (#17563)

[noreply] Remove python 3.6 postcommit from mass_comment.py (#17630)

[noreply] [BEAM-14347] Add some benchmarks for generic registration (#17613)

[noreply] Correctly route go dependency changes to go label (#17632)

[noreply] [BEAM-13695] Add jamm jvm options to Java 11 (#17178)

[noreply] [BEAM-14334] Fix leakage of SparkContext in Spark runner tests to remove

[noreply] Typo & link update (#17633)

[noreply] Trigger go precommits on go mod/sum changes (#17636)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fd61a90057011270dbf9a36c73b5baaf120100e2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fd61a90057011270dbf9a36c73b5baaf120100e2 # timeout=10
Commit message: "Trigger go precommits on go mod/sum changes (#17636)"
 > git rev-list --no-walk 0f38c82007bee45c375ec75a5c7af2c672483a19 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2192723667681767660.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0512150625 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 15s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/epfzqhm7h7jbm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/701/display/redirect?page=changes>

Changes:

[johnjcasey] [BEAM-14448] add datastore test

[yathu] [BEAM-14423] Add test cases for BigtableIO.BigtableWriterFn fails due to

[Pablo Estrada] Revert "Merge pull request #17517 from [BEAM-14383] Improve "FailedRows"

[noreply] [BEAM-14229] Fix SyntheticUnboundedSource duplication from checkpoint

[noreply] [BEAM-14347] Rename registration package to register (#17603)

[noreply] [BEAM-11104] Add self-checkpointing integration test (#17590)

[noreply] [BEAM-5492] Python Dataflow integration tests should export the pipeline

[noreply] [BEAM-14396] Bump httplib2 upper bound. (#17602)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0f38c82007bee45c375ec75a5c7af2c672483a19 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0f38c82007bee45c375ec75a5c7af2c672483a19 # timeout=10
Commit message: "[BEAM-14396] Bump httplib2 upper bound. (#17602)"
 > git rev-list --no-walk 5c21fbccec5e1e831dd0040bd7f631c050865430 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4770161501794335272.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0511150544 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dkbqn64q3jm6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #700

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/700/display/redirect?page=changes>

Changes:

[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Spark job

[noreply] Merge pull request #17559 from [BEAM-14423] Add exception injection

[noreply] [BEAM-11104] Allow self-checkpointing SDFs to return without finishing

[noreply] Merge pull request #17544 from [BEAM-14415] Exception handling tests for

[noreply] Merge pull request #17565 from [BEAM-14413] add Kafka exception test

[noreply] Merge pull request #17555 from [BEAM-14417] Adding exception handling

[noreply] [BEAM-14433] Improve Go split error message. (#17575)

[noreply] [BEAM-14429] Force java load test on dataflow runner v2

[noreply] Merge pull request #17577 from [BEAM-14435] Adding exception handling

[noreply] [BEAM-14347] Add generic registration functions for iters and emitters

[noreply] [BEAM-14169] Add Credentials rotation cron job for clusters (#17383)

[noreply] [BEAM-14347] Add generic registration for accumulators (#17579)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5c21fbccec5e1e831dd0040bd7f631c050865430 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5c21fbccec5e1e831dd0040bd7f631c050865430 # timeout=10
Commit message: "Merge pull request #15679 from aromanenko-dev/BEAM-12918-tpcds-jenkins"
 > git rev-list --no-walk 0ea809590ee3fa271b609e02d17bd1c9ec1eddf9 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7381207504597737380.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0510150620 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p7sy5ylo5vcxq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #699

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/699/display/redirect?page=changes>

Changes:

[elias.segundo] Changing elegibility to AllNodeElegibility

[chamikaramj] Adds code reviewers for GCP I/O connectors and KafkaIO to Beam OWNERS

[andyye333] Add extra details to PubSub matcher errors


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0ea809590ee3fa271b609e02d17bd1c9ec1eddf9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0ea809590ee3fa271b609e02d17bd1c9ec1eddf9 # timeout=10
Commit message: " [BEAM-14439] [BEAM-12673] Add extra details to PubSub matcher errors #17586"
 > git rev-list --no-walk 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins865063587422456133.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0509150552 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 13s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dcztjjog4ohqa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #698

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/698/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 70b7567de56af29745d98d5d24d2e2427045dd9d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
Commit message: "Merge pull request #17482 from ihji/BEAM-14374"
 > git rev-list --no-walk 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins9097543835111371904.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0508150546 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/37jwps7v2az3i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #697

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/697/display/redirect?page=changes>

Changes:

[kevinsijo] Setting up a basic directory

[kevinsijo] Mirroring Python SDK's directory structure

[kerrydc] Adds initial tests

[kevinsijo] 'runners' is the correct directory name

[Pablo Estrada] sketching the core API for JS SDK

[jonathanlui] add .gitignore for node/ts project

[Robert Bradshaw] Worker directory.

[Robert Bradshaw] Fix complile errors with explicit any for callables.

[Robert Bradshaw] Add worker entry point.

[Robert Bradshaw] Add proto generation code.

[Robert Bradshaw] Add generated proto files.

[Robert Bradshaw] Attempts to get ts protos to compile.

[Robert Bradshaw] Exclude ts protos for now.

[Robert Bradshaw] More changes to get ts protos working.

[Robert Bradshaw] Update scripts and config to get protos compiling.

[Robert Bradshaw] Update geenrated files.

[jonathanlui] add build and clean script to compile ts

[Robert Bradshaw] Generate server for loopback worker.

[Robert Bradshaw] Generated grpc servers for loopback.

[Robert Bradshaw] Add typescript formatter.

[Robert Bradshaw] Loopback server (that does nothing).

[Robert Bradshaw] Working server.

[Pablo Estrada] Starting expansion of primitive transforms

[Pablo Estrada] Starting to implement and support standard coders

[Robert Bradshaw] Also generate grpc clients.

[Robert Bradshaw] Basic implementation of worker harness.

[Pablo Estrada] fix the build

[Robert Bradshaw] Add some missing files for worker harness.

[Robert Bradshaw] Refactor operators to use registration.

[jonathanlui] enable ts in mocha

[jonathanlui] update readme

[jonathanlui] --save-dev @types/mocha

[jonathanlui] translate core_test.js to typescript

[Robert Bradshaw] Encapsulate worker service in a class.

[Kenneth Knowles] Port standard_coders_test to typescript (superficially)

[Pablo Estrada] Starting the proto translation of Impulse, ParDo, GBK

[Robert Bradshaw] Add some tests for the worker code.

[Robert Bradshaw] Fixing old lock file error.

[Pablo Estrada] Adding transform names and fixing GBK coder issue

[Robert Bradshaw] npx tsfmt -r src/apache_beam/base.ts src/apache_beam/transforms/core.ts

[Kenneth Knowles] switch to import style require() statements

[Kenneth Knowles] Add Coder interface using protobufjs classes

[Kenneth Knowles] BytesCoder with some failures

[noreply] Added GeneralObjectCoder and using it as coder for most transforms (#9)

[Kenneth Knowles] Fix order of arguments to deepEqual

[Kenneth Knowles] Encode expected encoding as binary

[Robert Bradshaw] Refactor API to allow for composites.

[jrmccluskey] Initial setup for automated Java expansion startup

[jrmccluskey] Update exp_service.ts

[Kenneth Knowles] Fix up coder deserialization

[Robert Bradshaw] Simplify GBK coder computation.

[Robert Bradshaw] Remove top-level PValue.

[Pablo Estrada] Make tests green

[Robert Bradshaw] Rename PValueish to PValue.

[jonathanlui] node runner

[jonathanlui] whitespaces

[Robert Bradshaw] Make Runner.run async.

[jonathanlui] bson and fast-deep-equal should not be listed as devdependency

[jrmccluskey] Add basic Dockerfile that starts ExternalWorkerPool

[Robert Bradshaw] Direct runner.

[kevinsijo] Testing expansion service communication

[Robert Bradshaw] Added flatten, assertion checkers.

[Pablo Estrada] progress on basic coders

[Robert Bradshaw] Fixing the build.

[Robert Bradshaw] Cleanup, simplify access.

[Pablo Estrada] Adding limited support for KVCoder and IterableCoder

[Robert Bradshaw] Introduce PipelineContext.

[Robert Bradshaw] Add toProto to all coders.

[Robert Bradshaw] Some work with coders.

[Robert Bradshaw] Remove debug logging.

[Robert Bradshaw] Use coders over data channel.

[Kenneth Knowles] explicitly sequence sub-coder serializations

[Kenneth Knowles] no more need to extend FakeCoder

[Kenneth Knowles] actually advance reader

[Kenneth Knowles] autoformat

[Kenneth Knowles] protobufjs already can write and read signed varints

[Kenneth Knowles] with improved test harness, kv has many more failures

[Kenneth Knowles] read bytescoder from correct position

[Kenneth Knowles] no more fake coders

[Kenneth Knowles] varint examples all work

[Kenneth Knowles] simplify coder value parsing

[Kenneth Knowles] global window coder

[Kenneth Knowles] fix swapEndian32

[Robert Bradshaw] Add P(...) operator.

[kevinsijo] Implementing RowCoder encoding.

[jrmccluskey] remove unused container dir

[kevinsijo] Corrected sorting of encoded positions to reflect an argsort instead.

[Robert Bradshaw] Populate environments.

[kevinsijo] Implementing RowCoder decoding.

[Kenneth Knowles] preliminary unbounded iterable coder

[Kenneth Knowles] friendlier description of standard coder test case

[Kenneth Knowles] fix test harness; iterable works

[jrmccluskey] first pass at boot.go

[jonathanlui] update package-lock.json

[jonathanlui] make NodeRunner a subclass of Runner

[jonathanlui] add waitUntilFinish interface member

[Pablo Estrada] Adding double coder

[Kenneth Knowles] scaffolding for windowed values

[Pablo Estrada] Adding type information to PColleciton and PTransform

[jonathanlui] fix direct runner

[Pablo Estrada] Adding typing information for DoFns

[Kenneth Knowles] add interval window

[Robert Bradshaw] Export PValue.

[Robert Bradshaw] Add CombineFn interface.

[Robert Bradshaw] Typed flatten.

[jonathanlui] add runAsync method to base.Runner

[Kenneth Knowles] add Long package

[Pablo Estrada] Adding more types. Making PValue typed

[Kenneth Knowles] instant coder draft

[Robert Bradshaw] Return job state from direct runner.

[Kenneth Knowles] type instant = long

[jonathanlui] implement NodeRunner.runPipeline

[Kenneth Knowles] autoformat

[kevinsijo] Completed implementation of basic row coder

[Kenneth Knowles] Fix IntervalWindowCoder, almost

[Kenneth Knowles] fix interval window coder

[Kenneth Knowles] autoformat

[Robert Bradshaw] loopback runner works

[Kenneth Knowles] move core element types into values.ts

[Kenneth Knowles] just build object directly to be cool

[Robert Bradshaw] GBK working on ULR.

[Robert Bradshaw] Async transforms.

[Robert Bradshaw] External transform grpah splicing.

[Kenneth Knowles] progress on windowed value: paneinfo encoding

[Robert Bradshaw] Fix merge.

[Robert Bradshaw] autoformat

[Kenneth Knowles] full windowed value coder

[kerrydc] Updates tests to use correct types, adds generics where needed to DoFns

[Robert Bradshaw] Add serialization librarires.'

[Robert Bradshaw] Add Split() PTransform, for producing multiple outputs from a single

[Robert Bradshaw] Schema-encoded external payloads.

[kevinsijo] Adding Schema inference from JSON

[Pablo Estrada] Removing unused directories

[Pablo Estrada] Support for finishBundle and improving typing annotations.

[Pablo Estrada] A base implementation of combiners with GBK/ParDo

[Robert Bradshaw] Fully propagate windowing information in both remote and direct runner.

[Robert Bradshaw] Make args and kwargs optional for python external transform.

[Robert Bradshaw] Infer schema for external transforms.

[Pablo Estrada] Implementing a custom combine fn as an example. Small fixes

[Robert Bradshaw] Fix missing windowing information in combiners.

[Robert Bradshaw] PostShuffle needn't group by key as that's already done.

[Robert Bradshaw] Guard pre-combine for global window only.

[Robert Bradshaw] WindowInto

[Robert Bradshaw] Fix optional kwargs.

[Robert Bradshaw] A couple of tweaks for js + py

[Robert Bradshaw] Add windowing file.

[Robert Bradshaw] CombineBy transform, stand-alone WordCount.

[Robert Bradshaw] cleanup

[Robert Bradshaw] Actually fix optional external kwargs.

[Robert Bradshaw] Demo2, textio read.

[Robert Bradshaw] Add command lines for starting up the servers.

[Robert Bradshaw] Run prettier on the full codebase.

[Robert Bradshaw] Update deps.

[Pablo Estrada] Adding docstrings for core.ts. Prettier dependency

[Pablo Estrada] Documenting coder interfaces

[Pablo Estrada] Added documentation for a few standard coders

[Robert Bradshaw] Unified grouping and combining.

[Robert Bradshaw] Allow PCollection ids to be lazy.

[Robert Bradshaw] Reorganize module structure.

[Robert Bradshaw] A couple more renames.

[Robert Bradshaw] Simplify.

[Robert Bradshaw] Consolidation.

[Robert Bradshaw] Fix build.

[Robert Bradshaw] Add optional context to ParDo.

[Robert Bradshaw] fixup: iterable coder endian sign issue

[Robert Bradshaw] omit context for map(console.log)

[Robert Bradshaw] Fix ReadFromText coders.

[Robert Bradshaw] Flesh out README with overview and current state.

[noreply] Readme typo

[Robert Bradshaw] Two more TODOs.

[noreply] Add a pointer to the example wordcount to the readme.

[Pablo Estrada] Documenting coders and implementing unknown-length method

[Robert Bradshaw] UIID dependency.

[Robert Bradshaw] Artifact handling.

[Robert Bradshaw] Properly wait on data channel for bundle completion.

[Robert Bradshaw] Automatic java expansion service startup.

[Robert Bradshaw] Process promises.

[Robert Bradshaw] Implement side inputs.

[Robert Bradshaw] Cleanup.

[Robert Bradshaw] Put complex constext stuff in its own file.

[Robert Bradshaw] Rename BoundedWindow to just Window.

[Robert Bradshaw] Alternative splitter class.

[Pablo Estrada] Documenting internal functions

[Robert Bradshaw] Take a pass clarifying the TODOs.

[Robert Bradshaw] Sql transform wrapper.

[Robert Bradshaw] Incorporate some feedback into the TODOs.

[Robert Bradshaw] More TODOs.

[Robert Bradshaw] Remove app placeholder.

[Robert Bradshaw] Apache license headers.

[Robert Bradshaw] More TODOs

[jankuehle] Suggestions for TypeScript todos

[dannymccormick] Add actions for typescript sdk

[dannymccormick] Fix test command

[noreply] Add missing version

[dannymccormick] Fix codecovTest command

[noreply] Only do prettier check on linux

[noreply] Only get codecov on linux

[Robert Bradshaw] Resolve some comments.

[Robert Bradshaw] Fix compile errors.

[Robert Bradshaw] Prettier.

[Robert Bradshaw] Re-order expandInternal arguments pending unification.

[Robert Bradshaw] More consistent and stricter PTransform naming.

[Robert Bradshaw] Notes on explicit, if less idiomatic, use of classes.

[Robert Bradshaw] Let DoFn be an interface rather than a class.

[Robert Bradshaw] Provide DoFn context to start and finish bundle.

[Robert Bradshaw] Optional promise code simplification.

[Robert Bradshaw] Cleanup todos.

[Robert Bradshaw] Avoid any type where not needed.

[Robert Bradshaw] Apache RAT excludes for typescript.

[Robert Bradshaw] Remove empty READMEs.

[Robert Bradshaw] Add licences statement to readme files.

[Robert Bradshaw] More RAT fixes.

[Robert Bradshaw] Another unsupported coder.

[Robert Bradshaw] Remove debugging code.

[noreply] Fix automatic naming with code coverage.

[Robert Bradshaw] Coders cleanup.

[Robert Bradshaw] Add tests for RowCoder.

[Robert Bradshaw] Normalize capitalization, comments.

[Robert Bradshaw] Install typescript closure packages.

[Robert Bradshaw] npm audit fix

[Robert Bradshaw] Move more imports out of base.

[Robert Bradshaw] Changes needed to compile with ts closure plugin.

[Robert Bradshaw] Use ttsc and ts-closure-transform plugin.

[Robert Bradshaw] Serialization registration to actually get serialization working.

[Robert Bradshaw] Container images working on local runner.

[Robert Bradshaw] Add a portable job server that proxies the Dataflow backend. (#17189)

[Robert Bradshaw] Improvements to dataflow job service for non-Python jobs.

[Robert Bradshaw] Get dataflow working.

[Robert Bradshaw] User friendly pipeline options.

[Robert Bradshaw] Less classes, more functions.

[Robert Bradshaw] Add new nullable standard coder.

[Robert Bradshaw] Make Apache Rat happy.

[Robert Bradshaw] Disable broken codecov.

[Robert Bradshaw] Remove last uses of base.ts.

[Robert Bradshaw] Remove unneedd file.

[Robert Bradshaw] Remove more uneeded/unused files.

[Robert Bradshaw] Cleanup tests.

[Robert Bradshaw] Minor cleanups to coder tests.

[noreply] Quote pip install package name

[noreply] [BEAM-14374] Fix module import error in FullyQualifiedNamedTransform

[Robert Bradshaw] Addressing issues from the review.

[noreply] Apply suggestions from code review.

[Robert Bradshaw] Post-merge fixes.

[dannymccormick] Delete tags.go

[Robert Bradshaw] Update tests to use our actual serialization libraries.

[Robert Bradshaw] Another pass at TODOs, removing finished items.

[Heejong Lee] [BEAM-14146] Python Streaming job failing to drain with BigQueryIO write

[Heejong Lee] add test

[noreply] Merge pull request #17490 from [BEAM-14370] [Website] Add new page about

[noreply] [BEAM-14332] Refactored cluster management for Flink on Dataproc

[noreply] [BEAM-13988] Update mtime to use time.UnixMilli() calls (#17578)

[noreply] Fixing patching error on missing dependencies (#17564)

[noreply] Merge pull request #17517 from [BEAM-14383] Improve "FailedRows" errors

[Heejong Lee] add test without mock


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 70b7567de56af29745d98d5d24d2e2427045dd9d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 70b7567de56af29745d98d5d24d2e2427045dd9d # timeout=10
Commit message: "Merge pull request #17482 from ihji/BEAM-14374"
 > git rev-list --no-walk 2af0dc79912011e46b297c2b8091a2ee0a191510 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8088870281924289066.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0507150554 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uj5outmg4rhca

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #696

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/696/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14173] Fix Go Loadtests on Dataflow & partial fix for Flink

[noreply] Upgrade python sdk container requirements. (#17549)

[noreply] Merge pull request #17497: [BEAM-11205] Update GCP Libraries BOM version

[noreply] [BEAM-12603] Add retry on grpc data channel and remove retry from test.

[noreply] Merge pull request #17359: [BEAM-14303] Add a way to exclude output

[Kenneth Knowles] Add parameter for service account impersonation in GCP credentials

[noreply] [BEAM-14347] Allow users to optimize DoFn execution with a single

[noreply] [BEAM-5878] Add (failing) kwonly-argument test (#17509)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2af0dc79912011e46b297c2b8091a2ee0a191510 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2af0dc79912011e46b297c2b8091a2ee0a191510 # timeout=10
Commit message: "Merge pull request #17394: [BEAM-14014] Add parameter for service account impersonation in GCP credentials"
 > git rev-list --no-walk 017f846ca342745cc1043c45b9ff25f6561d8dc0 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3989664279962806608.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0506150607 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3ycrpouqvvdrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #695

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/695/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-9245] Unable to pull datatore Entity which contains dict

[bulat.safiullin] [BEAM-14382] [Website] add banner container with css, images, html

[Jan Lukavský] [BEAM-14196] add test verifying output watermark propagation in bundle

[Jan Lukavský] [BEAM-14196] Fix FlinkRunner mid-bundle output watermark handling

[bulat.safiullin] [BEAM-14382] change mobile banner img, add padding to banner section

[ahmedabualsaud] fix test decorator typo

[noreply] Merge pull request #17440 from [BEAM-14329] Enable exponential backoff

[noreply] [BEAM-11104] Fix output forwarding issue for ProcessContinuations

[noreply] re-add testing package to pydoc (#17524)

[Heejong Lee] add test

[noreply] [BEAM-14250] Amended the workaround (#17531)

[noreply] [BEAM-11104] Fix broken split result validation (#17546)

[noreply] Fixed a SQL and screenshots in the Beam SQL blog (#17545)

[noreply] Merge pull request #17417: [BEAM-14388] Address some performance

[noreply] [BEAM-14386] [Flink] Support for scala 2.12 (#17512)

[noreply] [BEAM-14294] Worker changes to support trivial Batched DoFns (#17384)

[zyichi] Moving to 2.40.0-SNAPSHOT on master branch.

[zyichi] Move master readme.md to 2.40.0

[noreply] [BEAM-14048] [CdapIO] Add ConfigWrapper for building CDAP PluginConfigs


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 017f846ca342745cc1043c45b9ff25f6561d8dc0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 017f846ca342745cc1043c45b9ff25f6561d8dc0 # timeout=10
Commit message: "Merge pull request #17552 from y1chi/update_md"
 > git rev-list --no-walk 43d488c55cd25290c6f560f6649597fcc00dcc42 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins21008417301839079.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0505150604 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...

Publishing failed.

The build scan server appears to be unavailable.
Please check https://status.gradle.com for the latest service status.

If the service is reported as available, please report this problem via https://gradle.com/help/plugin and include the following via copy/paste:

----------
Gradle version: 7.4
Plugin version: 3.4.1
Request URL: https://status.gradle.com
Request ID: 77b509a8-bc17-4e20-8caf-d6cd3bf2df71
Response status code: 405
Response server type: Varnish
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #694

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/694/display/redirect?page=changes>

Changes:

[noreply] fix: JDBC config schema fields order

[Brian Hulette] Revert "Merge pull request #17255 from kileys/test-revert"

[Brian Hulette] BEAM-14231: bypass schema cache for

[noreply] [BEAM-13657] Follow up update version warning in __init__ (#17493)

[noreply] Merge pull request #17431 from [BEAM-14273] Add integration tests for BQ

[noreply] Merge pull request #17205 from [BEAM-14145] [Website] add carousel to

[noreply] [BEAM-14064] fix es io windowing (#17112)

[noreply] [BEAM-13670] Upgraded ipython from v7 to v8 (#17529)

[noreply] [BEAM-11104] Enable ProcessContinuation return values, add unit test

[Robert Bradshaw] [BEAM-14403] Allow Prime to be used with legacy workers.

[noreply] [BEAM-11106] Support drain in Go SDK (#17432)

[noreply] add __init__ to inference. (#17514)

[nielm] [BEAM-14405] Fix NPE when ProjectID is not specified in a template


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 43d488c55cd25290c6f560f6649597fcc00dcc42 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 43d488c55cd25290c6f560f6649597fcc00dcc42 # timeout=10
Commit message: "Merge pull request #17540: [BEAM-14405] Fix NPE when ProjectID is not specified in a template execution"
 > git rev-list --no-walk 0daef62a7bd993b13064de80588e343ee764e004 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6083874758662682046.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nrijpmjypb46y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #693

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/693/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Allow arithmetic between deferred scalars.

[noreply] [BEAM-8688] Upgrade GCSIO to 2.2.6 (#17486)

[noreply] [BEAM-14253] patch SubscriptionPartitionLoader to work around a dataflow

[noreply] Add website link log to notify user of pre-build workflow. (#17498)

[noreply] [BEAM-11105] Add timestamp observing watermark estimation (#17476)

[noreply] Merge pull request #17487 from Adding user-agent to GCS client in Python

[noreply] [BEAM-10265] Display error message if trying to infer recursive schema

[noreply] [BEAM-12575] Upgraded ipykernel from v5 to v6 (#17526)

[noreply] [BEAM-11105] Add docs + CHANGES.md entry for Go Watermark Estimation

[noreply] Merge pull request #17380 from [BEAM-14314][BEAM-9532] Add last_updated


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0daef62a7bd993b13064de80588e343ee764e004 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0daef62a7bd993b13064de80588e343ee764e004 # timeout=10
Commit message: "Merge pull request #17380 from [BEAM-14314][BEAM-9532] Add last_updated field in filesystem.FileMetaData"
 > git rev-list --no-walk e0166e294be4e4b2a3d219d3d18af0fa78c8fc92 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins393121336740370924.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uzn5ftj3dq7oa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #692

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/692/display/redirect?page=changes>

Changes:

[yathu] [BEAM-14375] Fix Java Wordcount Dataflow postcommit

[noreply] [BEAM-11105] Add manual watermark estimation (#17475)

[noreply] [BEAM-14390] Set user-agent when pulling licenses to avoid 403s (#17521)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e0166e294be4e4b2a3d219d3d18af0fa78c8fc92 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e0166e294be4e4b2a3d219d3d18af0fa78c8fc92 # timeout=10
Commit message: "Merge pull request #1748: [BEAM-14375] Fix Java Wordcount Dataflow postcommit for Gradle 7.4"
 > git rev-list --no-walk 4b413bbb5f8807b0f7a284fd818f2772f036fe55 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins8918485093624915538.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins
> Task :buildSrc:check
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 26s
10 actionable tasks: 8 executed, 1 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/u7aitczvohw7y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/691/display/redirect?page=changes>

Changes:

[noreply] Revert "Improvement to Seed job configuration to launch against PRs

[ilion.beyst] Minor: fix typo

[noreply] Merge pull request #17422 from [BEAM-14344]: remove tracing from


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b413bbb5f8807b0f7a284fd818f2772f036fe55 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b413bbb5f8807b0f7a284fd818f2772f036fe55 # timeout=10
Commit message: "Merge pull request #17515 from [BEAM-14377] Revert "Improvement to Seed job configuration so we can launch seed jobs against PRs""
 > git rev-list --no-walk 58b4d762eece66774a5df6ca54e6f91c49057c9b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins902256721302274975.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rfzrsgnwt37xg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #690

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/690/display/redirect?page=changes>

Changes:

[noreply] Revert "Merge pull request #17260 from [BEAM-13229] [Website] bug side

[noreply] [BEAM-14001] Add missing test cases to existing suites in exec package

[noreply] [BEAM-14243] Add staticcheck to Github Actions Precommits (#17479)

[noreply] [BEAM-14368][BEAM-13984] Change model loading from constructor to

[noreply] [BEAM-13983] changed file name from sklearn_loader to sklearn_inference

[noreply] Add SQL in Notebooks blog post (#17481)

[noreply] Merge pull request #17404: [BEAM-13990] support date and timestamp


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 58b4d762eece66774a5df6ca54e6f91c49057c9b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 58b4d762eece66774a5df6ca54e6f91c49057c9b # timeout=10
Commit message: "Merge pull request #17404: [BEAM-13990] support date and timestamp fields"
 > git rev-list --no-walk 8c4a056a63d92776ae9d6be726b37d789486afbd # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5886884703813794584.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ze4q7n73kh6lu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #689

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/689/display/redirect?page=changes>

Changes:

[ihr] Update Java katas to Beam 2.38

[Robert Bradshaw] Add element weighting parameter to BatchElements.

[noreply] [BEAM-14369] Fix "target/options: no such file or directory" error while

[noreply] [BEAM-14297] Enable nullable key and value arrays for xlang kafka io

[noreply] Merge pull request #17444 from [BEAM-14310] [Website] bug home

[noreply] Merge pull request #17388 from [BEAM-14311] [Website] Home Page

[noreply] [BEAM-14376] Typo in method description doc

[noreply] Add default classpath when not present (#17491)

[Robert Bradshaw] Clearer test.

[thiagotnunes] fix: update javadocs for ChangeStreamMetrics

[noreply] Merge pull request #17443 from [BEAM-12164]: use the end timestamp for

[noreply] Merge pull request #17260 from [BEAM-13229] [Website] bug side nav

[noreply] [BEAM-14351] Fix the template and move the announcement to the next


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8c4a056a63d92776ae9d6be726b37d789486afbd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8c4a056a63d92776ae9d6be726b37d789486afbd # timeout=10
Commit message: "Merge pull request #17465 Add element weighting parameter to BatchElements."
 > git rev-list --no-walk b0e6b561683425fe865720970ce60d45ecec11e4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4896086193213808317.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hdj4e3lwdyiis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #688

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/688/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #17226 from [BEAM-14204] [Playground] Tests for

[noreply] [BEAM-13015, BEAM-14184] Address unbounded number of messages being

[noreply] Improvement to Seed job configuration to launch against PRs (#17468)

[noreply] [BEAM-13983] Small changes to sklearn runinference (#17459)

[chamikaramj] Renames ExternalPythonTransform to PythonExternalTransform

[noreply] [BEAM-14351] Inherit from Coder. (#17437)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b0e6b561683425fe865720970ce60d45ecec11e4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b0e6b561683425fe865720970ce60d45ecec11e4 # timeout=10
Commit message: "[BEAM-14351] Inherit from Coder. (#17437)"
 > git rev-list --no-walk bb5342507e77b040f5bb402aa3628a180f7bf71e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7465072747978766080.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0428111153 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v2aufztan3kae

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/687/display/redirect?page=changes>

Changes:

[msbukal] FhirIO: use .search() or .searchType instead of .setResourceType()

[nick.caballero] [BEAM-14363] Fixes WatermarkParameters builder for Kinesis

[noreply] Remove unnecessary decorator from RunInference interface (#17463)

[noreply] [BEAM-13590] Minor deprecated warning fix (#17453)

[noreply] [BEAM-12164]: fix the negative throughput issue (#17461)

[noreply] Updated goldens for the screen diff integration tests (#17467)

[noreply] fixes copy by value error for bytes.Buffer in Error (#17469)

[noreply] Merge pull request #17354 from [BEAM-14170] - Create a test that runs

[noreply] Merge pull request #17447 from [BEAM-14357] Fix

[noreply] [BEAM-14324, BEAM-14325] Staticcheck cleanup in test files (#17393)

[noreply] BEAM-14187 Fix NPE (#17454)

[noreply] [BEAM-11105] Stateful watermark estimation (#17374)

[noreply] [BEAM-14304] implement parquetio to read/write parquet files (#17347)

[noreply] [BEAM-11104] Add Checkpointing split to Go SDK (#17386)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision bb5342507e77b040f5bb402aa3628a180f7bf71e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bb5342507e77b040f5bb402aa3628a180f7bf71e # timeout=10
Commit message: "[BEAM-11104] Add Checkpointing split to Go SDK (#17386)"
 > git rev-list --no-walk 07f30d221e4b285b23b74c3509d77b62388b7bb4 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1414584243313694531.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0427154008 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nogge7jgz3ljy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
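
This failure, repeated verbatim in every report below (builds #677 through #686), stops at the same place: line 49 of sdks/python/apache_beam/testing/load_tests/build.gradle, presumably the cross-project wiring enabled by the -PwithDataflowWorkerJar=true flag the job passes. Groovy resolves an expression like someProject.shadowJar against the Project object, and that lookup succeeds only while a task of that name is registered, so "Could not get unknown property 'shadowJar'" means the referenced project no longer defines the task, not that the environment is flaky. Below is a minimal sketch of the failing pattern next to a more defensive lookup; it is not Beam's actual build file, the project path masked as '****' in the log is assumed to be 'worker', and the task name is illustrative:

    // Sketch only. Assumes ':runners:google-cloud-dataflow-java:worker'
    // is the project whose name Jenkins masked as '****'.
    def workerProject = project(':runners:google-cloud-dataflow-java:worker')

    task copyWorkerJar {
        // Eager form: Groovy asks the Project for a property named
        // 'shadowJar'; if no such task exists, configuration aborts with
        // "Could not get unknown property 'shadowJar'".
        //   dependsOn workerProject.shadowJar

        // Defensive form: look the task up by name and tolerate absence.
        // (Depending on evaluation order, evaluationDependsOn may also be
        // needed so the worker project is configured first.)
        def shadow = workerProject.tasks.findByName('shadowJar')
        if (shadow != null) {
            dependsOn shadow
        } else {
            logger.warn('No shadowJar task on ' + workerProject.path)
        }
    }

Since the error names the project and its type (org.gradle.api.Project), Gradle did find the project and only the property lookup failed, which points to the worker module having removed or renamed its shadowJar task shortly before build #677; every timer-triggered run since then fails identically during configuration, in 1 to 21 seconds, before any load-test code executes.
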

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/686/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14343] Allow expansion service override in ExternalPythonTransform

[Heejong Lee] update

[Heejong Lee] allows remote host

[Heejong Lee] improve compatibility with python rowcoder

[ahmedabualsaud] added tempLocation to test pipeline options

[ahmedabualsaud] using tempRoot for temp bucket location

[ahmedabualsaud] small fixes

[noreply] [BEAM-14320] Update programming-guide w/Java GroupByKey example (#17369)

[noreply] Minor: Fix release script for `current` symlinks (#17457)

[noreply] Minor: fix typo (#17452)

[noreply] Change return type for PytorchInferenceRunner (#17460)

[noreply] [BEAM-13608] JmsIO dynamic topics feature (#17163)

[Heejong Lee] add test


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 07f30d221e4b285b23b74c3509d77b62388b7bb4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 07f30d221e4b285b23b74c3509d77b62388b7bb4 # timeout=10
Commit message: "Merge pull request #17418 from ihji/BEAM-14343"
 > git rev-list --no-walk 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
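
(A note on the SCM plumbing above: origin/master^{commit} peels the ref to the commit object it points at, so rev-parse always yields a commit SHA even when a ref is an annotated tag, and git rev-list --no-walk <sha> prints just the named commit without walking its ancestry, which the Jenkins git plugin appears to use to record the previously built revision for change detection.)
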
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2138889160257108013.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0426150542 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wdwuqmwavn4a2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #685

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/685/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
Commit message: "[BEAM-13953] added documentation for BQ Storage Write API (#17391)"
 > git rev-list --no-walk 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2971211683820978142.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0425150553 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dkbniaeygk6xm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #684

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/684/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13953] added documentation for BQ Storage Write API (#17391)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3f2e3c7c9eccb9d40370cbc70e9a451a4b5573f5 # timeout=10
Commit message: "[BEAM-13953] added documentation for BQ Storage Write API (#17391)"
 > git rev-list --no-walk 2c18ce0ccd7705473aa9ecc443dcdbe223dd9449 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins3630225432436643427.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0424150518 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/u64xlap3bpvto

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #683

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/683/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-14321] SQL passes Null for Null aggregates

[noreply] Create apache-hop-with-dataflow.md

[noreply] Add files via upload

[noreply] Delete website/www/site/content/en/blog/apache-hop-with-dataflow

[noreply] Add files via upload

[noreply] Update apache-hop-with-dataflow.md

[noreply] Update apache-hop-with-dataflow.md

[noreply] Update apache-hop-with-dataflow.md

[danielamartinmtz] Moved up get-credentials instruction for getting the kubeconfig file

[noreply] Merge pull request #17428: [BEAM-14326] Make sure BigQuery daemon thread

[noreply] [BEAM-14301] Add lint:ignore to noescape() func (#17355)

[noreply] [BEAM-14286] Remove unused vars in harness package (#17392)

[noreply] [BEAM-14327] Convert Results to QueryResults directly (#17398)

[noreply] [BEAM-14302] Simplify boolean check in fn.go (#17399)

[noreply] [BEAM-13983] Sklearn Loader for RunInference (#17368)

[noreply] Update authors.yml

[noreply] [BEAM-14358] add retry to connect to testcontainer (#17449)

[noreply] [BEAM-13106] Bump flink docs to 1.14 (#17430)


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2c18ce0ccd7705473aa9ecc443dcdbe223dd9449 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2c18ce0ccd7705473aa9ecc443dcdbe223dd9449 # timeout=10
Commit message: "[BEAM-13106] Bump flink docs to 1.14 (#17430)"
 > git rev-list --no-walk 1540b9dccc714d242a51929eac20ced06b1108eb # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins4337040279559543750.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0423150511 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dlwr3gij6xngk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #682

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/682/display/redirect?page=changes>

Changes:

[mmack] [BEAM-14345] Force paranamer 2.8 for Spark Hadoop version tests to avoid

[kamil.bregula] Revert "[BEAM-14300] Fix Java precommit failure"

[kamil.bregula] Revert "Merge pull request #17223 from [BEAM-14215] Improve argument

[noreply] [BEAM-13984] Implement RunInference for PyTorch (#17196)

[noreply] [BEAM-13945] add json type support for java bigquery connector (#17209)

[Andrew Pilloud] [BEAM-14348] Upgrade to ZetaSQL 2022.04.1

[Andrew Pilloud] [BEAM-13735] Enable ZetaSQL tests for Java 17

[noreply] [BEAM-14346] Fix incorrect error case index in ret2() (#17425)

[noreply] [BEAM-14342] Fix wrong default buffer type in fn_runner (#17420)

[noreply] Updates opencensus-api dependency to the latest version - 0.31.0

[noreply] [BEAM-14306] Add unit testing to pane coder (#17370)

[noreply] Updated the dep and golden for screen diff integration tests (#17442)

[noreply] [BEAM-13657] Add python 3.6 update to CHANGES.md (#17435)

[noreply] Merge pull request #17438: [BEAM-8127] The GCP module to declare


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1540b9dccc714d242a51929eac20ced06b1108eb (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1540b9dccc714d242a51929eac20ced06b1108eb # timeout=10
Commit message: "Merge pull request #17434: [BEAM-14348] Upgrade to ZetaSQL 2022.04.1"
 > git rev-list --no-walk 373c1c9cb96d77220494b6dbfb1467704639e700 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1812882945580531709.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0422150524 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qg64k25g4aymk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #681

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/681/display/redirect?page=changes>

Changes:

[vachan] Annotating Read API tests.

[bulat.safiullin] [BEAM-14247] [Website] add image

[bulat.safiullin] [BEAM-14247] [Website] center image

[mattcasters] BEAM-1857 : CHANGES.md entry for 2.38.0

[mmack] [BEAM-14335] Spotless Spark sources

[noreply] [BEAM-14112] Fixed ReadFromBigQuery with Interactive Beam (#17306)

[noreply] Update .asf.yaml (#17409)

[noreply] [BEAM-14336] Sickbay flight delays test - dataset seems to be missing

[noreply] [BEAM-14338] Update watermark unit tests to use time.Time.Equals()

[noreply] [BEAM-14328] Tweaks to "Differences from pandas" page (#17413)

[Andrew Pilloud] [BEAM-14253] Disable broken test pending Dataflow fix

[yiru] fix: BigQuery Storage Connector trace id population missing bracket

[noreply] [BEAM-14330] Temporarily disable the clusters auto-cleanup (#17400)

[noreply] Update Beam website to release 2.38.0 (#17378)

[noreply] [BEAM-14213] Add API and construction time validation for Batched DoFns

[noreply] Minor: Update release guide regarding archive.apache.org (#17419)

[noreply] [BEAM-14017] beam_PreCommit_CommunityMetrics_Cron test failing (#17396)

[noreply] BEAM-13582 Fixing broken links in the documentation (#17300)

[noreply] [BEAM-13657] Sunset python 3.6 (#17252)

[noreply] Removes unsupported Python 3.6 from the release validation script


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 373c1c9cb96d77220494b6dbfb1467704639e700 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 373c1c9cb96d77220494b6dbfb1467704639e700 # timeout=10
Commit message: "Removes unsupported Python 3.6 from the release validation script (#17397)"
 > git rev-list --no-walk e4d2050ccbaafb90428ab6c0cc494039f6282dae # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1122035712751130889.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0421153032 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 21s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cuo2vkswxb2b2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #680

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/680/display/redirect?page=changes>

Changes:

[andyye333] Change func to PTransform

[noreply] Populate actual dataflow job id to bigquery write trace id (#17130)

[relax] mark static thread as a daemon thread

[noreply] [BEAM-13866] Add miscellaneous exec unit tests (#17363)

[mmack] [BEAM-14323] Improve IDE integration of Spark cross version builds


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e4d2050ccbaafb90428ab6c0cc494039f6282dae (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e4d2050ccbaafb90428ab6c0cc494039f6282dae # timeout=10
Commit message: "Merge pull request #17389: [BEAM-14323] Improve IDE integration of Spark cross version builds"
 > git rev-list --no-walk 4b709d5456b105ffcc251da7a0a4a0b560491b1c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins7197999154613109409.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0420150519 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/v4kgbtrzo5nuo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #679

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/679/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14251] add output_coder_override to ExpansionRequest

[Heejong Lee] remove null

[rarokni] [BEAM-14307] Fix Slow Side input pattern bug in sample

[Heejong Lee] better error msg

[Heejong Lee] update from comments

[noreply] [BEAM-14316] Introducing KafkaIO.Read implementation compatibility

[noreply] [BEAM-14290] Address staticcheck warnings in the reflectx package

[noreply] [BEAM-14302] Simply bools in fn.go, genx_test.go (#17356)

[noreply] Merge pull request #17382: [BEAM-12356] Close DatasetService leak as


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4b709d5456b105ffcc251da7a0a4a0b560491b1c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b709d5456b105ffcc251da7a0a4a0b560491b1c # timeout=10
Commit message: "Merge pull request #17382: [BEAM-12356] Close DatasetService leak as local variables"
 > git rev-list --no-walk cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1816320938704738309.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0419150559 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nowldasd6iea6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #678

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/678/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cff9ccd86b390d8e5edfaa850fcf132da178330e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
Commit message: "[BEAM-13204] Fix website bug where code tabs do not appear if the default language is not available (#17379)"
 > git rev-list --no-walk cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins2054455528840450042.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0418150524 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_****s=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:****' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mgaulp2r5ens4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #677

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/677/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cff9ccd86b390d8e5edfaa850fcf132da178330e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
Commit message: "[BEAM-13204] Fix website bug where code tabs do not appear if the default language is not available (#17379)"
 > git rev-list --no-walk cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins5278740893319933529.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0417150514 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cfutlnurqnmmk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #676

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/676/display/redirect?page=changes>

Changes:

[pandiana] BigQueryServicesImpl: reduce number of threads spawned by

[noreply] [BEAM-13204] Fix website bug where code tabs do not appear if the


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision cff9ccd86b390d8e5edfaa850fcf132da178330e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f cff9ccd86b390d8e5edfaa850fcf132da178330e # timeout=10
Commit message: "[BEAM-13204] Fix website bug where code tabs do not appear if the default language is not available (#17379)"
 > git rev-list --no-walk ddd95c53738133fbb314cf9ba0ddd457774cfe28 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins1915856738666970651.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0416150525 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/j4ljxrfqpnooy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #675

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/675/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Upgrade to Gradle 7.4

[Kenneth Knowles] Remove Python module dependency on Dataflow worker

[noreply] [BEAM-11104] Pipe Continuation to DataSource level (#17334)

[noreply] [BEAM-11105] Basic Watermark Estimation (Wall Clock Observing) (#17267)

[noreply] Respect output coder for TextIO. (#17367)

[noreply] Merge pull request #17200 from [BEAM-12164]: fix the autoscaling backlog

[noreply] [BEAM-17035] Call python3 directly when it is available. (#17366)

[noreply] Merge pull request #17375: [BEAM-8691] Declare newer


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ddd95c53738133fbb314cf9ba0ddd457774cfe28 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ddd95c53738133fbb314cf9ba0ddd457774cfe28 # timeout=10
Commit message: "Merge pull request #17375: [BEAM-8691] Declare newer google-cloud-bigtable explicitly"
 > git rev-list --no-walk df6efe3644d08bda747d9d4434ab9e033073c8de # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_PubsubIOIT_Python_Streaming] $ /bin/bash -xe /tmp/jenkins6460769183051466021.sh
+ echo '*** PubsubIO Write Performance Test Python 2GB ***'
*** PubsubIO Write Performance Test Python 2GB ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/gradlew> -PloadTest.mainClass=apache_beam.io.gcp.pubsub_io_perf_test -Prunner=TestDataflowRunner '-PloadTest.args=--job_name=performance-tests-psio-python-2gb0415150512 --project=apache-beam-testing --region=us-central1 --temp_location=gs://temp-storage-for-perf-tests/loadtests --publish_to_big_query=true --metrics_dataset=beam_performance --metrics_table=psio_io_2GB_results --influx_measurement=python_psio_2GB_results --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --input_options='{"num_records": 2097152,"key_size": 1,"value_size": 1024}' --num_workers=5 --autoscaling_algorithm=NONE --pubsub_namespace_prefix=pubsub_io_performance_ --wait_until_finish_duration=600000 --runner=TestDataflowRunner' -PwithDataflowWorkerJar=true -PpythonVersion=3.7 --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:python:apache_beam:testing:load_tests:run
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 49

* What went wrong:
A problem occurred evaluating project ':sdks:python:apache_beam:testing:load_tests'.
> Could not get unknown property 'shadowJar' for project ':runners:google-cloud-dataflow-java:worker' of type org.gradle.api.Project.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/o6c7eq3vamzfe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #674

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/674/display/redirect?page=changes>

Changes:

[relax] handle changing schemas in Storage API sink

[noreply] Fix a couple style issues (#17361)

[noreply] [BEAM-14287] Clean up staticcheck warnings in graph/coder (#17337)

[noreply] Improvements to dataflow job service for non-Python jobs. (#17338)

[noreply] Bump minimist (#17290)

[noreply] Bump ansi-regex (#17291)

[noreply] Bump nanoid (#17292)

[noreply] Bump lodash (#17293)

[noreply] Bump url-parse (#17294)

[noreply] Bump moment (#17328)

[noreply] Merge pull request #15549 from [BEAM-11997] Changed RedisIO

[noreply] [BEAM-13925] Dont double assign committers if author or other reviewer

[noreply] [BEAM-13739] Remove deprecated shallow clone funcs (#17362)


------------------------------------------
[...truncated 55.89 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2729074 sha256=a0c96d986a30e55684d11de5af879c2c4c7c93bfcb4c6c028f0f0e1610e609f5
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.40 botocore-1.24.40 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649951446.409368/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220414155046410312-5521'
 createTime: '2022-04-14T15:50:52.458355Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-14_08_50_52-15213106085886930386'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0414150605'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-14T15:50:52.458355Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-14_08_50_52-15213106085886930386]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-14_08_50_52-15213106085886930386
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_08_50_52-15213106085886930386?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-14_08_50_52-15213106085886930386 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:57.151Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.138Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.167Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.223Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.256Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.284Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.336Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.420Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.456Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.485Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.517Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.556Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.589Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.657Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
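Taken together, the fusion messages above trace the write pipeline's shape: a bounded synthetic source emits the records, each one is formatted as a roughly 1 KB Pub/Sub message, timed, and published. A minimal local stand-in for that shape (toy source and print sink are illustrative; the real test uses a synthetic source and a real topic on Dataflow):

    import apache_beam as beam

    # Toy stand-in for the write pipeline traced by the fusion messages above.
    with beam.Pipeline() as p:
        _ = (
            p
            | 'Create input (stand-in)' >> beam.Create(range(5))
            | 'Format to pubsub message in bytes' >> beam.Map(lambda _: b'x' * 1024)
            | 'Measure time (stand-in)' >> beam.Map(lambda msg: msg)
            | 'Write to Pubsub (stand-in)' >> beam.Map(print)
        )
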
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.798Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.827Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.857Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.891Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.921Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:58.977Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:59.014Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:50:59.046Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:51:20.765Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:51:38.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T15:51:59.578Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-14_08_50_52-15213106085886930386 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: df1d8c646fd643498988c61d602c9bb1 and timestamp: 1649952139.076525:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 104
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0414150605.1649952143.584022/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220414160223584946-4315'
 createTime: '2022-04-14T16:02:30.336067Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-14_09_02_29-17762620975207896110'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0414150605'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-14T16:02:30.336067Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-14_09_02_29-17762620975207896110]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-14_09_02_29-17762620975207896110
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_09_02_29-17762620975207896110?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-14_09_02_29-17762620975207896110 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:35.344Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.381Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.405Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.473Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.533Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.562Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.613Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.674Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.723Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.751Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.844Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.913Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.943Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:37.979Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.027Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.053Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.086Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.121Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.158Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
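These fusion messages describe the companion read pipeline: read the messages back from Pub/Sub, measure and window them, count them globally, convert the count to bytes, and publish that single count message, which is what the test's matcher later checks for. A minimal local stand-in for the count-and-republish shape (toy input and print sink in place of the real Pub/Sub read and write):

    import apache_beam as beam

    # Toy stand-in for the read pipeline's count-and-republish shape.
    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read from pubsub (stand-in)' >> beam.Create([b'x' * 1024] * 3)
            | 'Count messages' >> beam.combiners.Count.Globally()
            | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
            # The real pipeline publishes this one element back to Pub/Sub; the
            # matcher then expects exactly one message: b'2097152' (num_records).
            | 'Write to Pubsub (stand-in)' >> beam.Map(print)
        )
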
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.195Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.228Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.252Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.279Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.314Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.370Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.409Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:38.429Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:02:51.224Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:03:19.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-14T16:03:43.517Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-14_09_02_29-17762620975207896110 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_128607c9-fe5a-4b6c-a047-be8442e7fb95_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_08_50_52-15213106085886930386?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-14_09_02_29-17762620975207896110?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError:
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_128607c9-fe5a-4b6c-a047-be8442e7fb95_read'
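
Two distinct problems show up in the chained traceback above. First, the assertion itself: the matcher waited 900 seconds for the single count message (b'2097152') on the read subscription and received nothing. Second, and independently, cleanup then crashed: the test passes the subscription path to delete_subscription() as a positional string, but the installed google-cloud-pubsub 2.12.0 (see the pip log above) generates clients whose first positional parameter is a request object, so the bare string is rejected. A minimal sketch of the keyword-argument form the 2.x client accepts, using the subscription path from the error message:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_128607c9-fe5a-4b6c-a047-be8442e7fb95_read')

    # Pre-2.0 clients accepted a positional string:
    #   sub_client.delete_subscription(sub_path)   # raises the TypeError above
    # 2.x generated clients take a request object or keyword arguments:
    sub_client.delete_subscription(subscription=sub_path)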

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 39m 40s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vvxue47quoppw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #673

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/673/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] Add remaining Dataflow test suites for Python 3.9.

[Heejong Lee] [BEAM-14232] Only resolve artifacts in expanded environments for Java

[noreply] Fix test ordering issue (#17350)

[buqian] Do not pass null to MoreObjects.firstNonNull as default value

[ningkang0957] [BEAM-14288] Fixed flaky test

[noreply] [BEAM-14277] Disables Spanner change streams tests (#17346)

[noreply] [BEAM-14219] Run cleanup script to remove stale prebuilt SDK container

[Heejong Lee] [BEAM-14300] Fix Java precommit failure

[noreply] [BEAM-14116] Rollback "Chunk commit requests dynamically (#17004)"

[noreply] [BEAM-13982] A base class for run inference (#16970)

[ningkang0957] Enumerates all possible expected strings when asserting

[noreply] [BEAM-13966] Add pivot(), a non-deferred column operation on categorical


------------------------------------------
[...truncated 55.07 KB...]
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2728896 sha256=dfc0027726295999ba4c9a3cf1fc942d51fd05f0c088dddb214ceee856a81eed
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.39 botocore-1.24.39 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865045.142366/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
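
The warning above means --pubsub_namespace_prefix is not a flag that PipelineOptions knows about, so it is dropped during argument parsing. A minimal sketch of how a custom option can be registered with the Beam Python SDK so it parses cleanly (illustrative only; the class name is hypothetical, not the test's actual code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag keeps PipelineOptions from reporting it
            # as an unparseable arg.
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PubsubNamespaceOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.pubsub_namespace_prefix)  # pubsub_io_performance_
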
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220413155045143512-1472'
 createTime: '2022-04-13T15:50:51.895646Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-13_08_50_51-2750334581216619093'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0413150514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-13T15:50:51.895646Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-13_08_50_51-2750334581216619093]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-13_08_50_51-2750334581216619093
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_08_50_51-2750334581216619093?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-13_08_50_51-2750334581216619093 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:55.560Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.039Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.092Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.157Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.185Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.211Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.232Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.257Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.283Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.309Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.354Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.380Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.401Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.452Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.476Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
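
The "Fusing consumer A into B" messages above describe producer-consumer fusion: Dataflow collapses adjacent ParDo steps into a single stage so elements flow in memory instead of being materialized between steps. Conceptually (a hedged illustration with made-up functions, not runner code), two fused Maps behave like one composed function applied per element:

    def parse(element):
        # First ParDo: decode the raw bytes.
        return element.decode("utf-8")

    def measure(element):
        # Second ParDo: pair the value with its length.
        return (element, len(element))

    def fused(element):
        # After fusion, the runner effectively applies the composition.
        return measure(parse(element))

    print(fused(b"hello"))  # ('hello', 5)
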
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.558Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.577Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.602Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.623Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.644Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.702Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.729Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:50:57.754Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:51:27.402Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
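
As the message suggests, projects at the 100-descriptor limit can free up room by deleting stale custom metric descriptors. A hedged sketch using the google-cloud-monitoring client to list deletion candidates (assumed ad-hoc tooling, not part of this job):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name="projects/apache-beam-testing",
        filter='metric.type = starts_with("custom.googleapis.com/")',
    )
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Destructive; uncomment only for descriptors known to be unused.
        # client.delete_metric_descriptor(name=descriptor.name)
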
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:51:41.071Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T15:52:03.261Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.294Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.349Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.384Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.422Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:01:02.455Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-13_08_50_51-2750334581216619093 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fb514593573b4726b04aa514c6470ba2 and timestamp: 1649865728.5488813:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 189
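
The timeout above is how these load tests bound their runtime: the runner polls the job for a fixed duration and then returns without cancelling it. A minimal sketch of the equivalent wait in the Beam Python SDK (duration is in milliseconds; the empty pipeline is a placeholder):

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    p = beam.Pipeline()  # transforms and options omitted for brevity
    result = p.run()
    # Returns after ~10 minutes even if the job is still running, which is
    # what produces the "Timing out on waiting for job ..." warning.
    result.wait_until_finish(duration=10 * 60 * 1000)
    if result.state != PipelineState.DONE:
        pass  # the harness proceeds to collect metrics anyway
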
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/dataflow-worker.jar in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0413150514.1649865733.629359/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220413160213630288-4533'
 createTime: '2022-04-13T16:02:22.919929Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-13_09_02_21-534198698279891282'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0413150514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-13T16:02:22.919929Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-13_09_02_21-534198698279891282]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-13_09_02_21-534198698279891282
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_09_02_21-534198698279891282?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-13_09_02_21-534198698279891282 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:30.922Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:36.668Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:41.688Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:41.847Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.072Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.119Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.174Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.234Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.262Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.353Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.387Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.440Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.590Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.618Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.673Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.707Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.772Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.895Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.919Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.951Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:42.985Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.018Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.077Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.105Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:43.152Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:02:59.225Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:17.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:17.145Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:27.489Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-13T16:03:52.257Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-13_09_02_21-534198698279891282 after 605 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0b154c4d4f7140a28ac6bc7e8f92f740 and timestamp: 1649866539.4664998:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 142
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0b154c4d4f7140a28ac6bc7e8f92f740 and timestamp: 1649866539.4664998:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 142
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_4e39f9c6-0ca3-4d99-bb55-ef5e1ceba75a_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_08_50_51-2750334581216619093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-13_09_02_21-534198698279891282?project=apache-beam-testing
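
The TypeError above is the actual failure: the subscription path is passed positionally to delete_subscription(), and google-cloud-pubsub 2.x interprets a positional argument as a request object, which a bare string cannot populate. A minimal sketch of the failing call and the keyword form that 2.x expects (the resource name is illustrative):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Hypothetical resource name, mirroring the one in the traceback.
    sub_name = "projects/my-project/subscriptions/my-subscription_read"

    # Fails on google-cloud-pubsub 2.x with the TypeError seen above, because
    # the string is handed to DeleteSubscriptionRequest as a mapping:
    # sub_client.delete_subscription(sub_name)

    # Works on 2.x: pass the resource name as a keyword argument.
    sub_client.delete_subscription(subscription=sub_name)
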

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 15s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rnm5ohj7u7qrk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #672

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/672/display/redirect?page=changes>

Changes:

[kamil.bregula] [BEAM-14215] Improve argument validation in SnowflakeIO

[benjamin.gonzalez] [BEAM-14013] Add PreCommit Kotlin examples Jenkins Job

[Andrew Pilloud] [BEAM-13151] Support multiple layers of AutoValue nesting

[Heejong Lee] [BEAM-14233] Merge requirements from expanded response for Java External

[benjamin.gonzalez] [BEAM-14013] Add spark, direct, flink runners as triggers for Kotlin

[noreply] [BEAM-13898] Add tests to the pubsubx package. (#17324)

[noreply] [BEAM-14285] Clean up Staticcheck Warnings in io packages (#17336)

[noreply] [BEAM-14187] Fix concurrency issue in IsmReaderImpl (#17201)

[noreply] [BEAM-14288] Skip flaking test

[noreply] Simplify specifying additional dependencies in Go SDK in XLang IOs

[noreply] [BEAM-14240] Clean staticcheck warnings in runner packages (#17340)

[Daniel Oliveira] [BEAM-13538] Workaround to fix go-licenses crash.


------------------------------------------
[...truncated 55.11 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726953 sha256=185aece9c53fd7ac8c3aa5b1fea602c78bdad50a05199c45edc46d31f264cf10
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.38 botocore-1.24.38 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649778645.423643/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220412155045424579-7183'
 createTime: '2022-04-12T15:50:51.433400Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-12_08_50_51-8594613633872340345'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0412150517'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-12T15:50:51.433400Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-12_08_50_51-8594613633872340345]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-12_08_50_51-8594613633872340345
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_08_50_51-8594613633872340345?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-12_08_50_51-8594613633872340345 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:55.447Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.324Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.355Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.425Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.494Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.521Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.552Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.586Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.621Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.650Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.678Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.739Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.767Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.790Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.848Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:56.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.005Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.040Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.074Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.099Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.129Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.186Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.214Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:50:57.254Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:51:16.584Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:51:37.210Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T15:52:03.175Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-12_08_50_51-8594613633872340345 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a121ca77f114458e82a2bcaf1975a8c0 and timestamp: 1649779348.0970185:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 184
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0412150517.1649779353.300132/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220412160233301081-2366'
 createTime: '2022-04-12T16:02:40.177974Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-12_09_02_39-9535075498141607387'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0412150517'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-12T16:02:40.177974Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-12_09_02_39-9535075498141607387]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-12_09_02_39-9535075498141607387
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_09_02_39-9535075498141607387?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-12_09_02_39-9535075498141607387 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:44.728Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.495Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.528Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.593Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.676Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.704Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.769Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.848Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.889Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.930Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.962Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:46.994Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.030Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.052Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.084Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.186Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.259Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.293Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.383Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.419Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.449Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.477Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.504Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.535Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.589Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.613Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:47.662Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:02:57.408Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:03:28.185Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-12T16:03:55.397Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-12_09_02_39-9535075498141607387 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f15da87fb515482684796ad5aab99ec5 and timestamp: 1649780101.4088836:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 140
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f15da87fb515482684796ad5aab99ec5 and timestamp: 1649780101.4088836:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 140
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f3a9d77c-c0d9-43de-98cf-201544891df0_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_08_50_51-8594613633872340345?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-12_09_02_39-9535075498141607387?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 35s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bhwkz3sa73kgi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #671

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/671/display/redirect>

Changes:


------------------------------------------
[...truncated 55.04 KB...]
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726931 sha256=e4550d2ac440bcf3501694ebbd1f58ae8abc76b1f921f51c412b12bddf33cb40
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.37 botocore-1.24.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692239.456186/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
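Note: the two warnings above are expected rather than a failure. --pubsub_namespace_prefix appears to be parsed by the load-test harness itself, not by Beam, so apache_beam's option parser reports it as unparseable and moves on. For reference, a flag only becomes a first-class pipeline option when it is declared on a PipelineOptions subclass; a minimal sketch (the PubsubPerfOptions class name is hypothetical, not the test's actual code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # A registered flag is parsed instead of being dropped with a
            # "Discarding unparseable args" warning.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default='pubsub_io_performance_',
                help='Prefix for temporary Pub/Sub topics and subscriptions.')

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix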
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220411155039457170-4956'
 createTime: '2022-04-11T15:50:45.622571Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-11_08_50_45-3037064181937316376'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0411150533'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-11T15:50:45.622571Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-11_08_50_45-3037064181937316376]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-11_08_50_45-3037064181937316376
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_08_50_45-3037064181937316376?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-11_08_50_45-3037064181937316376 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:50:57.526Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.303Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.333Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.404Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.444Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.477Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.506Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.530Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.562Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.598Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.630Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.666Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.699Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.726Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.757Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.789Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.880Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.911Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.958Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:00.990Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.016Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.078Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.114Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:01.179Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:32.754Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:32.787Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:35.908Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
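Note: the message above is Dataflow flagging the 100-descriptor limit on custom metrics in the project; all user metrics remain available under dataflow.googleapis.com/job/user_counter. If the custom.googleapis.com/* descriptors were actually needed, the remediation the message suggests is deleting stale descriptors. A hedged sketch using the google-cloud-monitoring client (project and filter are illustrative; anything matched should be verified as unused before deletion):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # List only custom metric descriptors, which count against the limit.
    descriptors = client.list_metric_descriptors(
        request={
            'name': project_name,
            'filter': 'metric.type = starts_with("custom.googleapis.com/")',
        })
    for descriptor in descriptors:
        # delete_metric_descriptor takes the descriptor's full resource name.
        client.delete_metric_descriptor(request={'name': descriptor.name})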
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:51:43.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T15:52:06.136Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-11_08_50_45-3037064181937316376 after 604 seconds
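Note: this timeout is the harness giving up on waiting, not the job failing; a streaming job stays in JOB_STATE_RUNNING while the test moves on to collect metrics. In the Beam Python API a bounded wait looks roughly like the sketch below (the 600-second figure mirrors the log and is illustrative, not the test's actual code):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline(options=PipelineOptions())
    _ = pipeline | beam.Create([b'payload'])  # placeholder transform
    result = pipeline.run()

    # duration is in milliseconds; wait_until_finish returns once it elapses,
    # even though a streaming job may still be RUNNING afterwards.
    state = result.wait_until_finish(duration=600 * 1000)
    if state != PipelineState.DONE:
        result.cancel()  # a perf test might instead go on to read metrics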
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8a48432880d2456c94042c3f37ad87f8 and timestamp: 1649692962.4810255:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 241
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0411150533.1649692966.624122/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220411160246625118-7269'
 createTime: '2022-04-11T16:02:52.629564Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-11_09_02_52-1350288650734643885'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0411150533'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-11T16:02:52.629564Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-11_09_02_52-1350288650734643885]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-11_09_02_52-1350288650734643885
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_09_02_52-1350288650734643885?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-11_09_02_52-1350288650734643885 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:02:58.432Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.174Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.221Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.305Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.375Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.403Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.468Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.534Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.585Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.933Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:00.967Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.032Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.066Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.103Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.143Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.171Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.203Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.235Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.290Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.318Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:01.388Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:09.104Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:03:46.207Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-11T16:04:13.930Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-11_09_02_52-1350288650734643885 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1997222e97c54425ae29e2a9aa719b0b and timestamp: 1649693760.147334:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 291
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1997222e97c54425ae29e2a9aa719b0b and timestamp: 1649693760.147334:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 291
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_08_50_45-3037064181937316376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-11_09_02_52-1350288650734643885?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_27430dfb-c522-4e8c-b409-ddad1549061e_read'
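Note: this TypeError matches the google-cloud-pubsub 2.x API change, in which GAPIC methods accept either a request object or keyword arguments; a bare positional string is interpreted as a DeleteSubscriptionRequest mapping and rejected. A minimal sketch of the keyword-argument form, assuming google-cloud-pubsub >= 2.0 (the subscription path is illustrative; the real name is generated per run):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = (
        'projects/apache-beam-testing/subscriptions/'
        'pubsub_io_performance_<uuid>_read')  # illustrative placeholder

    # Passing the path by keyword avoids it being treated as a request object.
    sub_client.delete_subscription(subscription=read_sub_name)
    # Equivalent form with an explicit request mapping:
    # sub_client.delete_subscription(request={'subscription': read_sub_name})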

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/l4hvctg4lcq6c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #670

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/670/display/redirect?page=changes>

Changes:

[chamikaramj] Re-raise exceptions swallowed in several Python I/O connectors

[noreply] Merge pull request #16928: [BEAM-11971] Re add reverted timer


------------------------------------------
[...truncated 55.51 KB...]
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726931 sha256=410e4deeb053292f19fd7f62d9eb961c68144c2fcbd290f8ea58f6589ace5c83
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.8
    Uninstalling pyparsing-3.0.8:
      Successfully uninstalled pyparsing-3.0.8
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.37 botocore-1.24.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649605846.960539/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220410155046961528-7352'
 createTime: '2022-04-10T15:50:53.396748Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-10_08_50_52-2743639274288009278'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0410150552'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-10T15:50:53.396748Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-10_08_50_52-2743639274288009278]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-10_08_50_52-2743639274288009278
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_08_50_52-2743639274288009278?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-10_08_50_52-2743639274288009278 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:50:59.352Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.620Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.652Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.720Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.754Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.786Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.831Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.865Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.907Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.963Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:02.997Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.048Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.076Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.108Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.224Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.253Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.288Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.316Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.348Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.407Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.444Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:03.473Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:11.693Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:35.569Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:35.596Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:51:45.899Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:52:07.294Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.298Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.370Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.394Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.426Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T15:59:16.464Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:00:28.932Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:00:28.996Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:00:29.030Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-10_08_50_52-2743639274288009278 is in state JOB_STATE_DONE
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fb4eabb67dc54e02b23ee2dcc5f4dc1e and timestamp: 1649606439.0959966:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 73
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0410150552.1649606443.271144/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220410160043272047-7201'
 createTime: '2022-04-10T16:00:49.823325Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-10_09_00_49-9968687670329101209'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0410150552'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-10T16:00:49.823325Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-10_09_00_49-9968687670329101209]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-10_09_00_49-9968687670329101209
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_09_00_49-9968687670329101209?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-10_09_00_49-9968687670329101209 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:01Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:06.883Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:06.937Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.013Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.093Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.123Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.188Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.255Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.294Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.331Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.363Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.394Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.427Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.505Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.538Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.570Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.637Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.669Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.703Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.753Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.795Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.825Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.885Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.922Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:07.955Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:08.012Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:08.043Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:08.092Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:20.624Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:38.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:38.548Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:01:48.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-10T16:02:12.988Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-10_09_00_49-9968687670329101209 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c5499bc635894d8da4716e2c8da10c57 and timestamp: 1649607289.397949:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 371
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c5499bc635894d8da4716e2c8da10c57 and timestamp: 1649607289.397949:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 371
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_08_50_52-2743639274288009278?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-10_09_00_49-9968687670329101209?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_1e8e0ee6-f248-470b-9211-a4401e668858_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 25s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xpodygs2cqhcm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #669

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/669/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-11714] Change spotBugs jenkins config

[Robert Bradshaw] Cleanup docs on Shared.

[Kyle Weaver] Nit: correct description for precommit cron jobs.

[benjamin.gonzalez] [BEAM-11714] Add dummy class for testing

[benjamin.gonzalez] [BEAM-11714] Remove dummy class used for testing

[benjamin.gonzalez] [BEAM-11714] Spotbugs print toJenkins UI precommit_Java17

[noreply] [BEAM-13767] Remove eclipse plugin as it generates a lot of unused tasks

[noreply] [BEAM-10708] Updated beam_sql error message (#17314)

[noreply] [BEAM-14281] add as_deterministic_coder to nullable coder (#17322)

[noreply] Improvements to Beam/Spark quickstart. (#17129)

[chamikaramj] Disable BigQueryIOStorageWriteIT for Runner v2 test suite


------------------------------------------
[...truncated 54.50 KB...]
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726702 sha256=cf92c79df76bd217d7325b77dd1a7abe4bc3b1d043954d5bbd09d97eccb7b639
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.37 botocore-1.24.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649519434.275398/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
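
The "Discarding unparseable args" warning is Beam's PipelineOptions parser dropping a flag that no registered options class declares. A hedged sketch of how such a flag is normally declared so it parses instead of being discarded (the class name is hypothetical; only the flag name is taken from the log):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            # Declaring the flag makes it a known pipeline option.
            parser.add_argument("--pubsub_namespace_prefix", default=None)
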
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220409155034276473-4613'
 createTime: '2022-04-09T15:50:40.630870Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-09_08_50_40-7816878964191486596'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0409150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-09T15:50:40.630870Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-09_08_50_40-7816878964191486596]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-09_08_50_40-7816878964191486596
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_08_50_40-7816878964191486596?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-09_08_50_40-7816878964191486596 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:44.357Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.126Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.156Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.223Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.256Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.286Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.315Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.349Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.389Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.428Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.497Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.523Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.555Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.579Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.643Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.759Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.798Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.828Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.863Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.896Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.954Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:45.978Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:50:46.008Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:51:20.829Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
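
The metric-descriptor message above describes a per-project limit: once 100 Dataflow-created custom descriptors exist, new custom.googleapis.com/* metrics are skipped, though the values remain available via dataflow.googleapis.com/job/user_counter. A hedged sketch of the cleanup it suggests, using the Cloud Monitoring client (project id and filter are illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = "projects/apache-beam-testing"  # illustrative

    # List custom descriptors and delete the unused ones; deletion is
    # irreversible, so a real cleanup should inspect each name first.
    descriptors = client.list_metric_descriptors(
        request={
            "name": project,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        })
    for d in descriptors:
        client.delete_metric_descriptor(name=d.name)
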
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:51:31.153Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T15:51:59.634Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-09_08_50_40-7816878964191486596 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c3e164cdace44842aa0deadd34dbcfe6 and timestamp: 1649520218.2549164:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
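
The "Timing out on waiting for job ... after 600 seconds" warning above marks a bounded wait elapsing while the streaming job is still running; the harness then collects metrics and proceeds, as the surrounding lines show. A minimal sketch of such a bounded wait with the Beam Python SDK (pipeline contents and options are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative options; a Dataflow run also needs project/region/temp_location.
    p = beam.Pipeline(options=PipelineOptions(streaming=True))
    _ = p | beam.Create([b"msg"]) | beam.Map(print)

    result = p.run()
    # duration is in milliseconds; on DataflowRunner the SDK logs the
    # "Timing out on waiting for job ..." warning if the job is still
    # running when the wait elapses, and control returns to the caller.
    result.wait_until_finish(duration=600 * 1000)
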
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0409150512.1649520222.016633/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220409160342017577-5352'
 createTime: '2022-04-09T16:03:48.094291Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-09_09_03_47-1709887723878028'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0409150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-09T16:03:48.094291Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-09_09_03_47-1709887723878028]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-09_09_03_47-1709887723878028
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_09_03_47-1709887723878028?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-09_09_03_47-1709887723878028 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:53.319Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.385Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.425Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.489Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.542Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.570Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.637Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.704Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.735Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.784Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.910Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.952Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:54.984Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.049Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.072Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.116Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.147Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.180Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.211Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.240Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.263Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.343Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.377Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:03:55.421Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:23.579Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:32.755Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:32.783Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:04:43.054Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-09T16:05:03.386Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-09_09_03_47-1709887723878028 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c2056643112841679d19c24be368e29e and timestamp: 1649521110.2676654:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 405
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_08_50_40-7816878964191486596?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-09_09_03_47-1709887723878028?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 154, in run
    self.cleanup()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_c22b07a0-fc3d-40f7-b6aa-56476efdd263_read'
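
Same failure as the earlier traceback: the cleanup passes the subscription path positionally, so the 2.x client tries to treat the string as a request mapping (the pubsub.DeleteSubscriptionRequest(request) frame above). An equivalent hedged sketch using the explicit request form the client builds internally (names illustrative):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = "projects/apache-beam-testing/subscriptions/example_read"

    # Passing a dict (or a DeleteSubscriptionRequest) is the other
    # accepted call shape in google-cloud-pubsub >= 2.0.
    sub_client.delete_subscription(request={"subscription": sub_path})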

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 5s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/53vjzbdlcaqno

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #668

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/668/display/redirect?page=changes>

Changes:

[johnjcasey] [BEAM-10529] add java and generic components of nullable xlang tests

[johnjcasey] [BEAM-10529] fix test case

[johnjcasey] [BEAM-10529] add coders and typehints to support nullable xlang coders

[johnjcasey] [BEAM-10529] update external builder to support nullable coder

[johnjcasey] [BEAM-10529] clean up coders.py

[johnjcasey] [BEAM-10529] add coder translation test

[johnjcasey] [BEAM-10529] add additional check to typecoder to not accidentally

[johnjcasey] [BEAM-10529] add test to retrieve nullable coder from typehint

[johnjcasey] [BEAM-10529] run spotless

[johnjcasey] [BEAM-10529] add go nullable coder

[johnjcasey] [BEAM-10529] cleanup extra println

[johnjcasey] [BEAM-10529] improve comments, clean up python

[bulat.safiullin] [BEAM-13992] [Website] update Contribute/Code Contribution Guide page

[bulat.safiullin] [BEAM-13992] [Website] change text, transfer tag a

[bulat.safiullin] [BEAM-13992] [Website] change code tags

[bulat.safiullin] [BEAM-13992] [Website] change text

[bulat.safiullin] [BEAM-13992] [Website] change text and links, add empty lines

[bulat.safiullin] [BEAM-13991] [Website] change links, add contribute file

[bulat.safiullin] [BEAM-13991] [Website] add content, add styles

[bulat.safiullin] [BEAM-13991] [Website] add images, add styles, delete spaces

[bulat.safiullin] [BEAM-13991] [Website] change url and aliases, delete bullet points

[bulat.safiullin] [BEAM-13991] [Website] add empty line

[bulat.safiullin] [BEAM-13992] [Website] change links, change text

[bulat.safiullin] [BEAM-13992] [Website] change links, add text, add dots

[bulat.safiullin] [BEAM-13992] [Website] change links, change text

[bulat.safiullin] [BEAM-13991] [Website] change styles, change quotes

[bulat.safiullin] [BEAM-13991] [Website] change link color

[bulat.safiullin] [BEAM-13992] [Website] change text, delete whitespace

[bulat.safiullin] [BEAM-13991] [Website] change text

[bulat.safiullin] [BEAM-13992] [Website] update text

[bulat.safiullin] [BEAM-13991] [Website] added changes from PR 13992, changed get-starting

[shivrajw] [BEAM-14236] Parquet IO support for list to conform with Apache Parquet

[chamikaramj] Sets 'sdk_harness_container_images' property for all Dataflow jobs -

[chamikaramj] Sets 'sdk_harness_container_images' property for all Dataflow jobs -

[mmack] [BEAM-14104] Support shard aware aggregation in Kinesis writer.

[noreply] [BEAM-11745] Fix author list rendering (#17308)

[noreply] [BEAM-14144] Record JFR profiles when GC thrashing is detected (#17151)

[noreply] Factors enable_prime flag in when checking use_unified_worker conditions

[noreply] [BEAM-11104] Add ProcessContinuation type to Go SDK (#17265)

[noreply] BEAM-13939: Restructure Protos to fix namespace conflicts (#16961)

[noreply] [BEAM-14270] Mark {Snowflake/BigQuery}Services as @Internal (#17309)

[noreply] [BEAM-13901] Add unit tests for graphx/cogbk.go

[noreply] [BEAM-14259, BEAM-14266] Remove unused function, replace use of ptypes

[noreply] [BEAM-14274] Fix staticcheck warnings in pipelinex (#17311)

[noreply] [BEAM-13857] Switched Go IT script to using Go flags for expansion

[noreply] Update python beam-master container image. (#17313)


------------------------------------------
[...truncated 54.61 KB...]
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-api-core[grpc,grpcgcp]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.7.0-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.1-py3-none-any.whl (114 kB)
  Using cached google_api_core-2.6.0-py2.py3-none-any.whl (114 kB)
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
  Using cached google_api_core-2.3.2-py2.py3-none-any.whl (109 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2726519 sha256=ccae4ff4c259122dd69cc421faa78fbaa098424171047aeccd03519080710d73
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.36 botocore-1.24.36 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.2 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433030.530072/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220408155030531019-1817'
 createTime: '2022-04-08T15:50:37.351243Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-08_08_50_36-9069213566820303836'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0408150541'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-08T15:50:37.351243Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-08_08_50_36-9069213566820303836]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-08_08_50_36-9069213566820303836
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_08_50_36-9069213566820303836?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-08_08_50_36-9069213566820303836 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:42.826Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.760Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.794Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.845Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.876Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.905Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.944Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:44.971Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.021Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.052Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.117Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.153Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.191Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.443Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.480Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.506Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.553Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.586Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.636Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.660Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:45.692Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:50:53.228Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:51:30.224Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:51:51.645Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.226Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.306Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.334Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.364Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T15:59:59.398Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-08_08_50_36-9069213566820303836 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: ac1e2554b8aa429090da43338d976e25 and timestamp: 1649433663.517995:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 81
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220407" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0408150541.1649433668.041929/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220408160108042854-7262'
 createTime: '2022-04-08T16:01:15.888663Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-08_09_01_14-10743278955627327941'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0408150541'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-08T16:01:15.888663Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-08_09_01_14-10743278955627327941]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-08_09_01_14-10743278955627327941
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_09_01_14-10743278955627327941?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-08_09_01_14-10743278955627327941 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:26.902Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.494Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.541Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.605Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.685Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.729Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.793Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.867Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.916Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.954Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:32.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.112Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.136Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.181Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.405Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.434Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.455Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.488Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.519Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.573Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.604Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:33.655Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:01:49.915Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:02:20.454Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-08T16:02:42.919Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-08_09_01_14-10743278955627327941 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4ada92d1ad434f32b40e2acb4b56f6f9 and timestamp: 1649434380.8059208:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 232
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_194dffaf-02f7-436e-8da4-e5104d6b0ee3_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_08_50_36-9069213566820303836?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-08_09_01_14-10743278955627327941?project=apache-beam-testing
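
The TypeError above is a client-API mismatch rather than a pipeline failure: google-cloud-pubsub 2.x (2.12.0 is installed in this build) treats a positional argument to delete_subscription() as a request mapping, so a plain subscription path string is rejected by proto-plus. A minimal sketch of the likely fix, assuming the 2.x SubscriberClient; the subscription path below is illustrative, not the test's real one:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = 'projects/my-project/subscriptions/my-sub'  # hypothetical path

    # Breaks on google-cloud-pubsub 2.x: the string is interpreted as a
    # DeleteSubscriptionRequest mapping, raising the TypeError seen above.
    # sub_client.delete_subscription(sub_name)

    # Works on 2.x: pass the path via the 'subscription' keyword argument.
    sub_client.delete_subscription(subscription=sub_name)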

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 37s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qthdgep6cb3ey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #667

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/667/display/redirect?page=changes>

Changes:

[noreply] Avoid pr-bot state desync (#17299)

[noreply] [BEAM-14259] Clean up staticcheck warnings in the exec package (#17285)

[noreply] Minor: Prefer registered schema in SQL docs (#17298)

[Kyle Weaver] [BEAM-14262] Update plugins for Dockerized Jenkins.

[Kyle Weaver] Add ansicolor and ws-cleanup plugins.

[noreply] [Playground] add meta tags (#17207)

[noreply] fixes golint and deprecated issues in recent Go SDK import (#17304)

[noreply] [BEAM-14266] Replace deprecated ptypes package uses (#17302)

[noreply] [BEAM-11936] Fix rawtypes warnings in SnowflakeIO (#17257)

[noreply] Merge pull request #17262: [BEAM-14244] Use the supplied output

[noreply] [BEAM-13015] Lookup the container for the step once when registering

[noreply] [BEAM-14175] Log read loop abort at debug rather than error (#17183)


------------------------------------------
[...truncated 51.71 KB...]
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2712736 sha256=48052be426da032e4b2a5f1810a85e470a2b510ec0a7c973fc36ecd2e03847ce
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.35 botocore-1.24.35 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.12.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.2 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:java:harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649346816.509315/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
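
The "Discarding unparseable args" warning is expected here: --pubsub_namespace_prefix is a test-harness flag that is never registered with the options parser, so PipelineOptions drops it. For reference, a minimal sketch of how a custom flag can be registered as a first-class pipeline option using the standard _add_argparse_args hook; the options class name and default are illustrative:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical options class
      @classmethod
      def _add_argparse_args(cls, parser):
        # A registered flag parses cleanly instead of being discarded.
        parser.add_argument('--pubsub_namespace_prefix', default=None)

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix
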
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220407155336510257-9757'
 createTime: '2022-04-07T15:53:43.410193Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-07_08_53_42-5342302405782967931'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0407150604'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-07T15:53:43.410193Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-07_08_53_42-5342302405782967931]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-07_08_53_42-5342302405782967931
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_08_53_42-5342302405782967931?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-07_08_53_42-5342302405782967931 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:48.394Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.539Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.572Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.634Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.661Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.706Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.740Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.788Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.828Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.954Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:53.979Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.025Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.054Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.182Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.212Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.238Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.258Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.291Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.341Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.374Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:53:54.914Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:54:26.684Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:54:34.696Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T15:55:00.505Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-07_08_53_42-5342302405782967931 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: b8bdc2e1cd7c4924bbb76c5981b1e2f1 and timestamp: 1649347535.2959418:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 209
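
The runtime metric above comes from the pipeline's "Measure time" step, which records wall-clock timestamps into a Beam distribution metric; the reported value is essentially the spread between the earliest and latest recorded timestamps. A rough sketch of that pattern, with the namespace and wiring simplified from the actual load-test helpers:

    import time

    import apache_beam as beam
    from apache_beam.metrics.metric import Metrics

    class MeasureTime(beam.DoFn):
      def __init__(self, namespace):
        self.runtime = Metrics.distribution(namespace, 'runtime')

      def start_bundle(self):
        self.runtime.update(time.time())  # early updates set the distribution min

      def finish_bundle(self):
        self.runtime.update(time.time())  # late updates set the distribution max

      def process(self, element):
        yield element

    # After the job finishes, runtime ~= distribution.max - distribution.min,
    # queried from the pipeline result's metrics().
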
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0407150604.1649347541.525844/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220407160541526749-7201'
 createTime: '2022-04-07T16:05:47.608693Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-07_09_05_47-10043726317574425415'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0407150604'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-07T16:05:47.608693Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-07_09_05_47-10043726317574425415]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-07_09_05_47-10043726317574425415
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_09_05_47-10043726317574425415?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-07_09_05_47-10043726317574425415 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:54.565Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.263Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.292Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.374Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.462Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.490Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.558Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.626Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.666Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.694Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.727Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.795Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.826Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.860Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.898Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.921Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:56.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.055Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.099Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.202Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.270Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.316Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.357Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.389Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.415Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.460Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.488Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:05:57.532Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:06:22.354Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:06:42.107Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-07T16:07:09.049Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-07_09_05_47-10043726317574425415 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6b3d0a69224843ccabe1a27781c825e1 and timestamp: 1649348251.9933283:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 93
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 1018, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f15f21f7-81f8-4443-876a-905a088f3495_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_08_53_42-5342302405782967931?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-07_09_05_47-10043726317574425415?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 6s
92 actionable tasks: 60 executed, 30 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nflg3u6isig6s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #666

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/666/display/redirect?page=changes>

Changes:

[bingyeli] update query

[Robert Bradshaw] [BEAM-14250] Fix coder registration for types defined in __main__.

[johnjcasey] [BEAM-14256] update SpEL dependency to 5.3.18.RELEASE

[johnjcasey] [BEAM-14256] remove .RELEASE

[dannymccormick] Fix dependency issue causing failures

[Kyle Weaver] [BEAM-9649] Add region option to Mongo Dataflow test.

[noreply] Allow get_coder(None).

[noreply] [BEAM-13015] Disable retries for fnapi grpc channels which otherwise

[noreply] [BEAM-13952] Sickbay

[noreply] BEAM-14235 parquetio module does not parse PEP-440 compliant Pyarrow

[noreply] [Website] Contribution guide page indent bug fix (#17287)

[noreply] [BEAM-10976] Document go sdk bundle finalization (#17048)

[noreply] [BEAM-13829] Expose status API from Go SDK Harness (#16957)


------------------------------------------
[...truncated 55.05 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: sqlalchemy, apache-beam
  Building wheel for sqlalchemy (setup.py): started
  Building wheel for sqlalchemy (setup.py): finished with status 'done'
  Created wheel for sqlalchemy: filename=SQLAlchemy-1.4.35-cp37-cp37m-linux_x86_64.whl size=1599143 sha256=1946c268ef57ed5bd1a946663d728aefa88db45ce36c199740c4f67d3d3be92b
  Stored in directory: /home/jenkins/.cache/pip/wheels/47/4b/54/e232479cdb4834a9fab3e9b9b11edb77472957215b129b8406
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2712736 sha256=54ea4f216a97d52a36346486cdde726033d297e750c79b257b0fb516b6e061ee
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built sqlalchemy apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.34 botocore-1.24.34 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.1 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.1 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.35 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260240.908256/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220406155040909269-7638'
 createTime: '2022-04-06T15:50:47.270007Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-06_08_50_46-15011121603976568263'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0406150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-06T15:50:47.270007Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-06_08_50_46-15011121603976568263]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-06_08_50_46-15011121603976568263
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_08_50_46-15011121603976568263?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-06_08_50_46-15011121603976568263 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:58.040Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.479Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.524Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.586Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.625Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.655Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.697Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.733Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.800Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.849Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.882Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.916Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.950Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:50:59.995Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.016Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.184Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.216Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.240Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.267Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.291Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.345Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.377Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:00.420Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:29.134Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:51:41.567Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T15:52:07.104Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-06_08_50_46-15011121603976568263 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: dd42e88ddddf42e28fce497b679faf9a and timestamp: 1649260943.952181:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 103
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0406150535.1649260949.135759/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220406160229136670-9391'
 createTime: '2022-04-06T16:02:35.583053Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-06_09_02_35-16822319082303818898'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0406150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-06T16:02:35.583053Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-06_09_02_35-16822319082303818898]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-06_09_02_35-16822319082303818898
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_09_02_35-16822319082303818898?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-06_09_02_35-16822319082303818898 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:46.214Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.105Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.157Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.234Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.302Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.334Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.395Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.440Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.481Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.507Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.571Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.646Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.678Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.711Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.745Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.785Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.847Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.925Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.950Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:47.981Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.009Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.030Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.064Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.117Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.152Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:02:48.183Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:03:07.285Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:03:31.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-06T16:03:58.075Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-06_09_02_35-16822319082303818898 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 58057af7156441f18992d2275c1c514e and timestamp: 1649261687.0942461:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 134
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_cd87c33f-c69e-41ae-8814-228198d77b98_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_08_50_46-15011121603976568263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-06_09_02_35-16822319082303818898?project=apache-beam-testing
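
The TypeError in the cleanup step above is an API-compatibility failure, not a pipeline failure: in google-cloud-pubsub 2.x the subscription path is a keyword-only argument of SubscriberClient.delete_subscription(), so a positional string lands in the `request` parameter and proto-plus rejects it. A minimal sketch of the failing and accepted call shapes under that assumption (the project and subscription names here are placeholders, not the test's actual values):

    # Minimal sketch, assuming google-cloud-pubsub >= 2.0.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Placeholder path; the perf test derives its own from a namespace prefix.
    sub_path = "projects/my-project/subscriptions/my-subscription"

    # Fails exactly as in the traceback above: the positional string is
    # treated as a DeleteSubscriptionRequest and rejected by proto-plus.
    # sub_client.delete_subscription(sub_path)

    # Accepted forms on the 2.x client:
    sub_client.delete_subscription(subscription=sub_path)
    sub_client.delete_subscription(request={"subscription": sub_path})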

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 20s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lop2jjqkh327g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #665

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/665/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-8970] Add docs to run wordcount example on portable Spark Runner

[Kiley Sok] Update python container version

[benjamin.gonzalez] [BEAM-8970] Add period to end of sentence

[Kyle Weaver] Add self-descriptive message for expected errors.

[noreply] Add --dataflowServiceOptions=enable_prime to useUnifiedWorker conditions

[noreply] [BEAM-10529] nullable xlang coder (#16923)

[noreply] Fix go fmt break in core/typex/special.go (#17266)

[noreply] [BEAM-5436] Add doc page on Go cross compilation. (#17256)

[noreply] Pr-bot Don't count all reviews as approvals (#17269)

[noreply] Fix postcommits (#17263)

[noreply] [BEAM-14241] Address staticcheck warnings in boot.go (#17264)

[noreply] [BEAM-14157] GrpcWindmillServer: Use stream specific boolean to do

[noreply] [BEAM-10582] Allow (and test) pyarrow 7 (#17229)

[noreply] [BEAM-13519] Solve race issues when the server responds with an error


------------------------------------------
[...truncated 55.76 KB...]
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2712594 sha256=a564f1648ab35075d545ff88dc43fedd27dc0b04bd2e2a02513bd07230327713
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.33 botocore-1.24.33 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.1 google-cloud-core-1.7.2 google-cloud-datastore-1.15.4 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.1 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.3 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649173836.392900/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220405155036393877-9762'
 createTime: '2022-04-05T15:50:43.218242Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-05_08_50_42-2460069620461278920'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0405150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-05T15:50:43.218242Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-05_08_50_42-2460069620461278920]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-05_08_50_42-2460069620461278920
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_08_50_42-2460069620461278920?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-05_08_50_42-2460069620461278920 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:51.879Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.670Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.716Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.786Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.817Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.853Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.885Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.910Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.948Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:52.977Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.004Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.030Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.285Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.315Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.345Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.379Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.417Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.493Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.520Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:50:53.567Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:51:20.619Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:51:41.407Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T15:52:08.086Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-05_08_50_42-2460069620461278920 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7ddc5d9bb77e4419bd538bc081280910 and timestamp: 1649174555.0291286:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 136
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220331" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0405150536.1649174559.004543/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220405160239005481-5339'
 createTime: '2022-04-05T16:02:46.650564Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-05_09_02_46-16565799279862875471'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0405150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-05T16:02:46.650564Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-05_09_02_46-16565799279862875471]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-05_09_02_46-16565799279862875471
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_09_02_46-16565799279862875471?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-05_09_02_46-16565799279862875471 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:52.563Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.809Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.853Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.920Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:56.983Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.011Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.080Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.145Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.176Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.246Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.308Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.334Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.368Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.403Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.437Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.470Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.527Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.549Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.573Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.605Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.641Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.672Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.704Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.727Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.760Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.819Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.851Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:02:57.882Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:03:21.333Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:03:43.383Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-05T16:04:08.920Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-05_09_02_46-16565799279862875471 after 603 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_0bcd7cd6-45e7-4bdf-b57b-fd0f809d765b_read_matcher.
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 149, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 60, in _assert_match
    raise AssertionError(description)
AssertionError:
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 154, in run
    self.cleanup()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_0bcd7cd6-45e7-4bdf-b57b-fd0f809d765b_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_08_50_42-2460069620461278920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-05_09_02_46-16565799279862875471?project=apache-beam-testing
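
For context on the assertion diff above: the fused steps in this job's graph (Count messages -> Convert to bytes -> Write to Pubsub) show that the read pipeline publishes its element count as a single byte-string message, and the matcher expected b'2097152'. That figure is consistent with the 2 GB input (per the "2gb" suffix in the job name) being synthesized as 1 KB records, an assumption this arithmetic sketch makes explicit:

    # Expected message count if 2 GiB of input is produced as 1 KiB records.
    expected = (2 * 1024**3) // 1024
    assert expected == 2097152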

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 40m 4s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6ftbldjn5q324

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #664

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/664/display/redirect>

Changes:


------------------------------------------
[...truncated 55.00 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2711991 sha256=8d5adc790d971e130221a2964f18e320acab7fbea22efe76f23ca0bb892d3c9d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.32 botocore-1.24.32 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649087431.719604/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220404155031720560-5310'
 createTime: '2022-04-04T15:50:39.612948Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-04_08_50_39-12155189229686329970'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0404150543'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-04T15:50:39.612948Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-04_08_50_39-12155189229686329970]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-04_08_50_39-12155189229686329970
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_08_50_39-12155189229686329970?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-04_08_50_39-12155189229686329970 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:43.533Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.815Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.852Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.905Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.940Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.962Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:44.983Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.004Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.046Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.075Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.103Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.131Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.190Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.225Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.416Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.448Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.481Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.514Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.537Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.594Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.631Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:50:45.678Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:18.936Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:19.518Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:19.538Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:29.732Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T15:51:53.386Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-04_08_50_39-12155189229686329970 after 605 seconds
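
The "Timing out" warning is the harness abandoning its bounded wait on the streaming job, not the job failing; the metrics below are still collected. A sketch of the usual Beam Python pattern for such a bounded wait, with an illustrative timeout and an assumed pipeline object:

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()
    # wait_until_finish takes the timeout as a duration in milliseconds.
    result.wait_until_finish(duration=600 * 1000)
    if result.state != PipelineState.DONE:
        result.cancel()  # streaming jobs must be stopped explicitly
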
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 486c6265e7744882bcb97f1d7151b22c and timestamp: 1649088216.9733398:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0404150543.1649088220.871698/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
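
The repeated warning above is benign here: --pubsub_namespace_prefix is evidently consumed by the test harness itself (the prefix shows up in the subscription names below) and is not registered with any PipelineOptions subclass, so Beam drops it instead of failing. For a flag that should parse as a pipeline option, a minimal sketch (class name and help text are illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used to namespace Pub/Sub test resources.')

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix
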
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220404160340872588-2382'
 createTime: '2022-04-04T16:03:47.627415Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-04_09_03_46-10844960694172558554'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0404150543'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-04T16:03:47.627415Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-04_09_03_46-10844960694172558554]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-04_09_03_46-10844960694172558554
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_09_03_46-10844960694172558554?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-04_09_03_46-10844960694172558554 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:52.889Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.647Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.671Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.731Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.792Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.821Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.917Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:53.965Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.016Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.110Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.133Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.155Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.181Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.245Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.319Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.350Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.384Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.418Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
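
The fusion messages above trace the read-side pipeline's shape: a Pub/Sub read, decode and measurement steps, windowing, a globally-lifted count (the KeyWithVoid / CombinePerKey / UnKey trio is how CombineGlobally expands), and a write of the count back to Pub/Sub. A hedged reconstruction in Beam Python; transform labels mirror the log, while resource names, the window size, and the lambdas are illustrative stand-ins:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    SUB = 'projects/my-project/subscriptions/perf_read'  # illustrative
    TOPIC = 'projects/my-project/topics/perf_results'    # illustrative

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | 'Read from pubsub' >> beam.io.ReadFromPubSub(subscription=SUB)
         | 'Map' >> beam.Map(lambda data: data)     # stand-in for the test's lambda
         | 'Measure time' >> beam.Map(lambda x: x)  # stand-in for the MeasureTime DoFn
         | 'Window' >> beam.WindowInto(beam.window.FixedWindows(60))
         | 'Count messages' >> beam.CombineGlobally(
             beam.combiners.CountCombineFn()).without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
         | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic=TOPIC))
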
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.453Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.474Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.501Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.538Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.572Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.613Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.635Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:03:54.677Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:08.710Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:25.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:25.247Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:04:35.478Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-04T16:05:00.003Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-04_09_03_46-10844960694172558554 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e353fda9025d4b5fb11787dd6f257286 and timestamp: 1649089122.3941875:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 253
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e353fda9025d4b5fb11787dd6f257286 and timestamp: 1649089122.3941875:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 253
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_08_50_39-12155189229686329970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-04_09_03_46-10844960694172558554?project=apache-beam-testing
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f5cbd081-9d14-4c4e-bff1-e2dfce2a28fa_read'
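
The cleanup failure above is a client-library calling-convention problem rather than a pipeline error: google-cloud-pubsub 2.x generated clients no longer accept the resource path as a positional argument, so the string is misread as a request mapping and the DeleteSubscriptionRequest constructor rejects it. A minimal sketch of the 2.x-style call, with an illustrative subscription path:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Illustrative; the test derives the real path from its namespace prefix.
    sub_name = "projects/apache-beam-testing/subscriptions/example_read"

    # 1.x style, raises the TypeError seen above on 2.x:
    #   sub_client.delete_subscription(sub_name)
    # 2.x style: pass the path as a keyword (or inside a request mapping):
    sub_client.delete_subscription(subscription=sub_name)
    # equivalent: sub_client.delete_subscription(request={"subscription": sub_name})
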

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 21s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5q5rr72fizs2y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #663

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/663/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14143] Simplifies the ExternalPythonTransform API (#17101)


------------------------------------------
[...truncated 54.92 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.8.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2711991 sha256=9a3e714c870932dc452b8a1806c1d3f9465c5a864b0015443987e50d9b580559
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.32 botocore-1.24.32 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001028.934009/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220403155028934979-6810'
 createTime: '2022-04-03T15:50:34.590741Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-03_08_50_34-3763690064078693377'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0403150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-03T15:50:34.590741Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-03_08_50_34-3763690064078693377]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-03_08_50_34-3763690064078693377
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_08_50_34-3763690064078693377?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-03_08_50_34-3763690064078693377 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:41.006Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:41.967Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.021Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.083Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.121Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.149Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.183Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.216Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.247Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.269Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.293Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.326Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.361Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.398Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.429Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.455Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.536Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.568Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.599Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.634Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.657Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.715Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.747Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:50:42.792Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:14.776Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:14.800Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:17.926Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:25.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T15:51:48.136Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-03_08_50_34-3763690064078693377 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 697081be162843d9b094e46f89545d09 and timestamp: 1649001811.832278:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0403150534.1649001816.313489/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220403160336314397-3454'
 createTime: '2022-04-03T16:03:44.142056Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-03_09_03_42-11817159222089951991'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0403150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-03T16:03:44.142056Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-03_09_03_42-11817159222089951991]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-03_09_03_42-11817159222089951991
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_09_03_42-11817159222089951991?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-03_09_03_42-11817159222089951991 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.081Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.720Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.750Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.829Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.902Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:49.938Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.003Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.078Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.121Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.189Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.235Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.270Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.335Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.367Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.399Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.564Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.595Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.627Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.662Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.694Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.729Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.785Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.814Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:03:50.856Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:04:26.254Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:04:36.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-03T16:04:59.755Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-03_09_03_42-11817159222089951991 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a1dff3675aa84004b439c528cd3e997b and timestamp: 1649002591.7351391:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 119
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a1dff3675aa84004b439c528cd3e997b and timestamp: 1649002591.7351391:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 119
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_bec1b1ba-bd86-4cb4-93f2-0b2ea5367842_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_08_50_34-3763690064078693377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-03_09_03_42-11817159222089951991?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 9s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7egr6lhpxfbas

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #662

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/662/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-14133] Fix potential NPE in BigQueryServicesImpl.getErrorInfo

[Robert Bradshaw] Revert "Revert "[BEAM-14038] Auto-startup for Python expansion service.

[Robert Bradshaw] Skip failing test for now.

[Kyle Weaver] [BEAM-14225] load balance jenkins jobs

[noreply] [BEAM-14153] Reshuffled Row Coder PCollection used as Side Input cause

[noreply] delint go sdk (#17247)

[Heejong Lee] add test

[noreply] Merge pull request #16841 from [BEAM-8823] Make FnApiRunner work by

[noreply] [BEAM-14192] Update legacy container version (#17210)

[noreply] Fix mishandling of API with BQIO (#17211)

[noreply] [BEAM-14221] Update documentation with Flink on Dataproc features

[Kiley Sok] Revert "[BEAM-14190] Python sends dataflow schema field"


------------------------------------------
[...truncated 55.64 KB...]
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2711991 sha256=3ebda5ee9610893a97c131316bde15d66d185883ff33de271eca12a2609b369b
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.32 botocore-1.24.32 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648914630.974077/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220402155030975011-9943'
 createTime: '2022-04-02T15:50:37.159203Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-02_08_50_36-14063834008723349518'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0402150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-02T15:50:37.159203Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-02_08_50_36-14063834008723349518]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-02_08_50_36-14063834008723349518
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_08_50_36-14063834008723349518?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-02_08_50_36-14063834008723349518 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:42.726Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:43.831Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:43.863Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:43.957Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.004Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.034Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.058Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.091Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.122Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.151Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.184Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.218Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.277Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.415Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.450Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.483Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.516Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.541Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.658Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.692Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:44.727Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:50:59.553Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:51:23.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T15:51:50.332Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-02_08_50_36-14063834008723349518 after 603 seconds
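
This timeout warning is the expected exit path for the write phase: the harness waits on the streaming job with a bounded duration rather than blocking until a terminal state. A sketch of the underlying API, with hypothetical options mirroring the test setup (wait_until_finish takes milliseconds):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions([
        '--runner=DataflowRunner',          # values below are illustrative
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/loadtests',
        '--streaming',
    ])
    p = beam.Pipeline(options=opts)
    _ = p | beam.Create([b'msg']) | beam.Map(print)
    result = p.run()
    # A streaming job is usually still RUNNING when the wait expires,
    # which produces the "Timing out on waiting for job ..." warning.
    result.wait_until_finish(duration=600 * 1000)
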
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7cc9c8454d374700bfeef19f59cb695d and timestamp: 1648915419.3568988:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
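
For scale: assuming the "2gb" in the job name is the total volume published, a 102-second write runtime implies roughly 20 MiB/s aggregate throughput across the five workers:

    total_bytes = 2 * 1024**3                 # assumption: 2 GiB total payload
    runtime_s = 102                           # pubsub_io_perf_write_runtime above
    print(total_bytes / runtime_s / 1024**2)  # ~20.1 MiB/s
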
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0402150535.1648915424.373994/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220402160344374877-3621'
 createTime: '2022-04-02T16:03:50.921308Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-02_09_03_50-8566020059848353751'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0402150535'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-02T16:03:50.921308Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-02_09_03_50-8566020059848353751]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-02_09_03_50-8566020059848353751
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_09_03_50-8566020059848353751?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-02_09_03_50-8566020059848353751 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:01.325Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.293Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.324Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.391Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.464Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.484Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.526Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.612Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.643Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.674Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.707Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.739Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.772Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.852Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.905Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:02.983Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.250Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.277Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.300Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.323Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.346Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.397Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.419Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:03.459Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:25.224Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:04:44.107Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-02T16:05:07.976Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-02_09_03_50-8566020059848353751 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_7d92fc2f-c1dd-416a-966d-48d7e8867ccd_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_08_50_36-14063834008723349518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-02_09_03_50-8566020059848353751?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
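
The expected payload b'2097152' is a message count, not data: the read pipeline counts what the write phase published and republishes that single count, which the matcher then looks for. Assuming 1 KiB messages (an assumption; only the 2 GiB total comes from the job name), the arithmetic is:

    print(2 * 1024**3 // 1024)  # 2097152 messages expected for 2 GiB at 1 KiB each
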


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_7d92fc2f-c1dd-416a-966d-48d7e8867ccd_read'
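
This TypeError is a cleanup bug distinct from the matcher timeout above: google-cloud-pubsub 2.x made the first positional parameter of SubscriberClient.delete_subscription() the request object, so passing a bare subscription path string fails inside DeleteSubscriptionRequest's constructor. A minimal sketch of the failing call and the 2.x-compatible forms (subscription path hypothetical):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/my-project/subscriptions/example_read'  # hypothetical

    # Raises TypeError on google-cloud-pubsub 2.x, as in the traceback:
    # sub_client.delete_subscription(sub_path)

    # 2.x-compatible equivalents:
    sub_client.delete_subscription(subscription=sub_path)
    # or: sub_client.delete_subscription(request={'subscription': sub_path})
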

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
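
Taken together, those suggestions amount to rerunning the failing task with diagnostics enabled, e.g.:

    ./gradlew :sdks:python:apache_beam:testing:load_tests:run --stacktrace --info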

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 42s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/c3a4gvyrfqeng

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #661

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/661/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-14164] [Website] change styles

[Andrew Pilloud] [BEAM-14190] Python sends dataflow schema field

[noreply] [BEAM-14179] Fix possibly null value

[noreply] [BEAM-12815] Try to fix flaky Flink Post Commit (#17227)

[noreply] Add a portable job server that proxies the Dataflow backend. (#17189)

[noreply] [BEAM-14130] Implement JupyterLab extension for managing Dataproc

[Andrew Pilloud] [BEAM-13741] Remove forced calcite dependency from BaseBeamTable

[noreply] [BEAM-13951] Update release guide with pointers on updating


------------------------------------------
[...truncated 55.44 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2703619 sha256=27785b7966e64159054ac0b59e7fecbe4f4054d61a53f63b3dfb53d5d50aae85
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.1 azure-storage-blob-12.11.0 boto3-1.21.31 botocore-1.24.31 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.34 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648828246.601442/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220401155046602392-6626'
 createTime: '2022-04-01T15:50:52.719080Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-01_08_50_52-5073730340269013657'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0401150532'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-01T15:50:52.719080Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-01_08_50_52-5073730340269013657]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-01_08_50_52-5073730340269013657
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_08_50_52-5073730340269013657?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-01_08_50_52-5073730340269013657 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:50:58.558Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.277Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.379Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.460Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.501Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.527Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.560Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.583Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.661Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.685Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.709Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.743Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.771Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.792Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.836Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.933Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:04.986Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.022Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.047Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.079Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.130Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.152Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:05.184Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:28.339Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:34.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:34.856Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5. This could be a quota issue.
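
One way to check the quota hypothesis is to inspect the regional Compute Engine quotas (project value taken from the log):

    gcloud compute regions describe us-central1 --project=apache-beam-testing
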
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:51:45.115Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T15:52:08.877Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-01_08_50_52-5073730340269013657 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a667e2693702471fa5f954cc815adb06 and timestamp: 1648829024.9020364:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/dataflow-worker.jar in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0401150532.1648829028.626891/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220401160348627794-6554'
 createTime: '2022-04-01T16:03:56.957762Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-04-01_09_03_56-765668334667913902'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0401150532'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-04-01T16:03:56.957762Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-04-01_09_03_56-765668334667913902]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-04-01_09_03_56-765668334667913902
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_09_03_56-765668334667913902?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-04-01_09_03_56-765668334667913902 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:03.405Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:10.309Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.333Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.736Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.812Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.843Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.919Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:15.988Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.032Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.072Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.136Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.207Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.242Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.274Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.345Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.378Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.444Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.476Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.542Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.582Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.614Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.636Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.670Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.735Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.772Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:16.854Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:04:22.206Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:05:01.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-04-01T16:05:25.999Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-04-01_09_03_56-765668334667913902 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 9841426c87fa4ed8b4e1b7c7645fe224 and timestamp: 1648829844.7056231:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 88
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_4453e2f7-2902-4a40-a9b2-6505120e9690_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_08_50_52-5073730340269013657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-04-01_09_03_56-765668334667913902?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 58s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fmyg3c77eij6u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #660

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/660/display/redirect?page=changes>

Changes:

[brachipa] [BEAM-14094]Fix null pointer exception in HllCountInitFn

[brachipa] [BEAM-14094]Fix null pointer exception in HllCountInitFn

[noreply] Merge pull request #17149 from [BEAM-13883] [Playground] Increase test

[Kiley Sok] ignore test

[noreply] [BEAM-13948] Add unstack(), a non-deferred column operation on

[noreply] [BEAM-10976] Bundle finalization: E2E support (#17045)


------------------------------------------
[...truncated 55.28 KB...]
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2698067 sha256=c2df61a0424fc39591e2a73f77aab5d2fc598250a10c747fc874f59609085688
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.11.0 boto3-1.21.30 botocore-1.24.30 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.33 tenacity-5.1.5 testcontainers-3.5.0 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648741844.821784/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
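The 'Discarding unparseable args' warnings above mean the --pubsub_namespace_prefix flag is not registered on the PipelineOptions subclass doing the parsing, so Beam drops it instead of failing. A minimal sketch of how such a flag is normally registered (the class name and default below are illustrative, not the test's actual code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag lets PipelineOptions parse it instead of
            # discarding it as unparseable.
            parser.add_argument('--pubsub_namespace_prefix', default=None)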
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220331155044822732-6144'
 createTime: '2022-03-31T15:50:52.326556Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-31_08_50_51-9882255125219877299'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0331150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-31T15:50:52.326556Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-31_08_50_51-9882255125219877299]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-31_08_50_51-9882255125219877299
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_08_50_51-9882255125219877299?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-31_08_50_51-9882255125219877299 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:50:56.483Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.445Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.520Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.620Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.659Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.685Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.717Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.747Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.788Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.857Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.890Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:02.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.020Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.130Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.162Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.189Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.222Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.254Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.313Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.341Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:03.381Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:17.875Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:51:47.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T15:52:11.409Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-31_08_50_51-9882255125219877299 after 604 seconds
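The 'Timing out on waiting' warning above is the load test's bounded wait, not a job failure: the runner stops waiting after the configured duration and the streaming job keeps running. A sketch of the underlying pattern, assuming an already-built Beam pipeline (the 600-second figure is illustrative):

    result = pipeline.run()
    # wait_until_finish takes a duration in milliseconds and returns whatever
    # state the job reached within that window; a streaming job is typically
    # still RUNNING when the wait expires.
    result.wait_until_finish(duration=600 * 1000)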
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: d6b729a782fe4977b59584ac1bac9c13 and timestamp: 1648742655.0827138:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 96
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0331150537.1648742659.055022/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220331160419055933-9268'
 createTime: '2022-03-31T16:04:26.398123Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-31_09_04_25-4322660594425624147'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0331150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-31T16:04:26.398123Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-31_09_04_25-4322660594425624147]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-31_09_04_25-4322660594425624147
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_09_04_25-4322660594425624147?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-31_09_04_25-4322660594425624147 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:31.979Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.470Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.497Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.580Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.640Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.668Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.728Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.818Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.858Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.888Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.917Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:33.969Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.002Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.057Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.111Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.183Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.215Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.276Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.296Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.316Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.375Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.397Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.446Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.481Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:04:34.504Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:05:00.127Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:05:19.129Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-31T16:05:46.115Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-31_09_04_25-4322660594425624147 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1e1a2131175b4408a6574afeaeec8b62 and timestamp: 1648743616.770134:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 307
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_08_50_51-9882255125219877299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-31_09_04_25-4322660594425624147?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_bf134323-af7b-4376-ab96-fa3ad834cbae_read'
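The TypeError above is the actual failure: cleanup passes the subscription path positionally, and the google-cloud-pubsub 2.x SubscriberClient treats its first positional argument as a request object, which proto-plus then rejects. A minimal sketch of the keyword form the 2.x API expects (client construction and the subscription path below are placeholders):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = 'projects/apache-beam-testing/subscriptions/example_read'

    # Fails on pubsub 2.x: the bare string is taken as a request mapping.
    # sub_client.delete_subscription(read_sub_name)

    # Works: pass the path by keyword ...
    sub_client.delete_subscription(subscription=read_sub_name)
    # ... or wrap it in an explicit request dict.
    sub_client.delete_subscription(request={'subscription': read_sub_name})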

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w3nixy4bf6twq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #659

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/659/display/redirect?page=changes>

Changes:

[vachan] Update display data to include BQ information.

[noreply] Revert "[BEAM-14084] iterable_input_value_types changed from list to

[egalpin] [BEAM-14003] Adds compat for Elasticsearch 8.0.0

[egalpin] [BEAM-13136] Removes support for Elasticsearch 2.x

[Valentyn Tymofieiev] Ensure the removed option prebuild_sdk_container_base_image not used on

[noreply] Merge pull request #17202 from [BEAM-14194]: Disallow autoscaling for

[noreply] Merge pull request #17080 from [BEAM-13880] [Playground] Increase test

[noreply] Merge pull request #17050 from [BEAM-13877] [Playground] Increase test

[noreply] [BEAM-14200] Improve SamzaJobInvoker extensibility (#17212)

[noreply] Merge pull request #17148 from [BEAM-14042] [playground] Scroll imports

[noreply] [BEAM-13918] Increase datastoreio go sdk unit test coverage (#17173)

[noreply] Merge pull request #16819: [BEAM-13806] Adding test suite for Go x-lang


------------------------------------------
[...truncated 55.40 KB...]
  Using cached more_itertools-8.12.0-py3-none-any.whl (54 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2696577 sha256=0c0b5adc4a1e1dfac126265ddf023a269ec5c8d9de43f9f351aa7c5902cc0e55
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.11.0 boto3-1.21.29 botocore-1.24.29 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.3 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648655435.430726/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220330155035431715-7021'
 createTime: '2022-03-30T15:50:43.091910Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-30_08_50_42-14528617402994523475'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0330150617'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-30T15:50:43.091910Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-30_08_50_42-14528617402994523475]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-30_08_50_42-14528617402994523475
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_08_50_42-14528617402994523475?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-30_08_50_42-14528617402994523475 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:52.213Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.760Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.785Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.849Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.886Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.915Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.948Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:53.982Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.026Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.070Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.105Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.143Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.400Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.439Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.467Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.503Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.537Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.597Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.626Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:50:54.657Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:51:05.418Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:51:46.751Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T15:52:10.840Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-30_08_50_42-14528617402994523475 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 06cf7d105b7f4fda98fa7ccc7b5b06d1 and timestamp: 1648656240.7515266:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0330150617.1648656245.695693/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220330160405696596-1170'
 createTime: '2022-03-30T16:04:13.427193Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-30_09_04_13-606840261092082655'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0330150617'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-30T16:04:13.427193Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-30_09_04_13-606840261092082655]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-30_09_04_13-606840261092082655
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_09_04_13-606840261092082655?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-30_09_04_13-606840261092082655 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:18.935Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.275Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.329Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.415Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.499Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.527Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.591Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.681Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.720Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.853Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.897Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.918Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.969Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:22.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.024Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.067Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.099Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.127Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.156Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.208Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.242Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.274Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.356Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.394Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:23.445Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:04:50.276Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:05:08.173Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-30T16:05:34.676Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-30_09_04_13-606840261092082655 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 169c243ef62e494a946d94b7ccb7c904 and timestamp: 1648657182.8920436:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 267
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_08_50_42-14528617402994523475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-30_09_04_13-606840261092082655?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_8dce2603-2890-4d74-9c68-7372a0fda6bb_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 20s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wbdzybce2jtws

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #658

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/658/display/redirect?page=changes>

Changes:

[hengfeng] feat: remove the metadata table after the pipeline finishes

[thiagotnunes] test: add test for metadata table dropping

[noreply] [BEAM-14177] Fix GBK re-iteration caching for portable runners. (#17184)

[noreply] Merge pull request #17187: [BEAM-14181] Make sure to evict connections

[noreply] Only reset transform.label if it is correctly assigned (#17192)

[noreply] [BEAM-12641] Use google-auth instead of oauth2client for GCP auth

[Robert Bradshaw] [BEAM-14163] Fix typo in single core per container logic.

[chamikaramj] Convert URLs to local jars when constructing filesToStage

[thiagotnunes] test: disable SpannerIO.readChangeStream test

[noreply] Merge pull request #17164 from [BEAM-14140][Playground] Fix Deploy

[noreply] Merge pull request #16855 from [BEAM-13938][Playground] Increase test

[noreply] [BEAM-13314]Revise recommendations to manage Python pipeline


------------------------------------------
[...truncated 54.41 KB...]
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)

> Task :runners:google-cloud-dataflow-java:worker:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.2-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2696331 sha256=7dbaec09403d781bb2038730197786cf3b7429f103fa94f0b16b807761a74aba
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.11.0 boto3-1.21.28 botocore-1.24.28 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.7.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.2 wrapt-1.14.0

> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569127.950120/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
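
The warning above appears because --pubsub_namespace_prefix is not a standard Beam option, so PipelineOptions drops it during parsing. A minimal sketch of how a custom flag can be registered so it parses cleanly (the class name and default value here are illustrative assumptions, not the test suite's actual code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Hypothetical registration of the flag seen in the warning above.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for temporary Pub/Sub topics and subscriptions.')

    opts = PubsubNamespaceOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.pubsub_namespace_prefix)  # -> pubsub_io_performance_
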
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220329155207951052-6507'
 createTime: '2022-03-29T15:52:14.581687Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-29_08_52_14-11276422842038304298'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0329150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-29T15:52:14.581687Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-29_08_52_14-11276422842038304298]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-29_08_52_14-11276422842038304298
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_08_52_14-11276422842038304298?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-29_08_52_14-11276422842038304298 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:20.689Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.039Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.089Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.179Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.241Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.287Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.321Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.361Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.446Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.539Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.573Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.688Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.733Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.877Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.932Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:23.974Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.047Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.086Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.202Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.271Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:24.308Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:52:54.794Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
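
The message above means the project has hit the 100-descriptor limit for custom metrics, so new custom.googleapis.com/* metrics are silently skipped. A hedged sketch of the suggested cleanup using the Cloud Monitoring client (the project id is taken from the log; run the delete only after inspecting the output, since deleting a descriptor is irreversible):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print(descriptor.type)  # review before deleting anything
        # client.delete_metric_descriptor(
        #     name=f'{project_name}/metricDescriptors/{descriptor.type}')
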
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:53:10.415Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T15:53:34.202Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-29_08_52_14-11276422842038304298 after 600 seconds
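
The "Timing out" warning is expected here: the load test bounds how long it waits for the streaming job rather than waiting for a terminal state. A minimal sketch of that pattern (the 600_000 ms value is an assumption chosen to match the ~600 s timeouts in this log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Options are a stand-in; the real test targets DataflowRunner, which
    # honors the duration argument below.
    p = beam.Pipeline(options=PipelineOptions())
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
    result = p.run()
    # Wait at most ~600 s (duration is in milliseconds); this may return
    # while the streaming job is still running, producing the warning above.
    result.wait_until_finish(duration=600_000)
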
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 739a2f29538c4739ba0cc214faefdd44 and timestamp: 1648569907.4379876:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
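
The runtime metric above is derived from a Beam distribution that the "Measure time" step updates with wall-clock timestamps; the metrics reader then reports roughly max minus min. A simplified sketch of such a step (the class name and namespace are illustrative, not the harness's exact code):

    import time

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class MeasureTimeFn(beam.DoFn):
        """Records bundle start/end times into a distribution metric."""

        def __init__(self, namespace='pubsub_io_perf'):
            self.runtime = Metrics.distribution(namespace, 'runtime')

        def start_bundle(self):
            self.runtime.update(int(time.time()))

        def finish_bundle(self):
            self.runtime.update(int(time.time()))

        def process(self, element):
            yield element
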
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0329150534.1648569912.632737/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220329160512633711-9694'
 createTime: '2022-03-29T16:05:19.050028Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-29_09_05_18-3961140471685971338'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0329150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-29T16:05:19.050028Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-29_09_05_18-3961140471685971338]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-29_09_05_18-3961140471685971338
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_09_05_18-3961140471685971338?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-29_09_05_18-3961140471685971338 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:24.113Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.010Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.043Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.106Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.204Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.239Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.294Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.347Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.384Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.419Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.442Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.529Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.644Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.674Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.706Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.792Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.845Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.882Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.923Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:25.974Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.026Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.063Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.125Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.187Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:26.244Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:05:36.991Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:06:10.164Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-29T16:06:35.744Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-29_09_05_18-3961140471685971338 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4eca1ae65ea84f6f8c8edaae99903ff9 and timestamp: 1648570921.8876903:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 351
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_e6af68a0-c6e7-483b-8c3c-4d4606db5eaf_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_08_52_14-11276422842038304298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-29_09_05_18-3961140471685971338?project=apache-beam-testing
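
The TypeError in the traceback above is the root cause of the failure: with google-cloud-pubsub 2.x, passing the subscription path as a positional argument makes the client treat it as a DeleteSubscriptionRequest mapping, which a plain string cannot initialize. A minimal sketch of the failing call and the keyword-argument forms the 2.x API accepts (the subscription path is copied from the log):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_e6af68a0-c6e7-483b-8c3c-4d4606db5eaf_read')

    # Fails with the TypeError above: the string lands in the request slot.
    # sub_client.delete_subscription(sub_path)

    # Works: name the argument...
    sub_client.delete_subscription(subscription=sub_path)
    # ...or pass an explicit request mapping.
    sub_client.delete_subscription(request={'subscription': sub_path})
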

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32m 39s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/aoqce3afhscwm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #657

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/657/display/redirect?page=changes>

Changes:

[noreply] Minor: Add warning about pubsub client to Beam 2.36.0 blog (#17188)


------------------------------------------
[...truncated 55.66 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695994 sha256=9a24c2940a225b86a6b44ca9d4bb5cf3f55046881f49947379a1f90e15c0be4f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.27 botocore-1.24.27 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648482639.626516/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220328155039627465-7907'
 createTime: '2022-03-28T15:50:45.040438Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-28_08_50_44-6445254387771334530'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0328150538'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-28T15:50:45.040438Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-28_08_50_44-6445254387771334530]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-28_08_50_44-6445254387771334530
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_08_50_44-6445254387771334530?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-28_08_50_44-6445254387771334530 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:56.886Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.660Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.712Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.770Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.828Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.865Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.901Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.932Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:57.973Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.045Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.070Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.133Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.166Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.200Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.297Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.330Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.386Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.421Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.455Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.564Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.591Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:50:58.620Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:51:30.532Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:51:38.465Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T15:52:02.872Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.791Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.838Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.875Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.916Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:00:16.951Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-28_08_50_44-6445254387771334530 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a80a679d3bf244a583d9bf116787aa2f and timestamp: 1648483367.3406136:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 110
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0328150538.1648483371.529303/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220328160251530260-3259'
 createTime: '2022-03-28T16:02:57.547576Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-28_09_02_57-2745958637978817641'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0328150538'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-28T16:02:57.547576Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-28_09_02_57-2745958637978817641]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-28_09_02_57-2745958637978817641
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_09_02_57-2745958637978817641?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-28_09_02_57-2745958637978817641 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:04.546Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.631Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.743Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.806Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.874Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.903Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:10.970Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.078Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.116Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.143Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.164Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.197Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.328Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.361Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.393Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.414Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.446Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.522Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.558Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.595Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.621Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.653Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.687Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.751Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.775Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:11.804Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:31.882Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:42.275Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:42.308Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:03:52.539Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-28T16:04:14.937Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-28_09_02_57-2745958637978817641 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 75d0113e14884cc1b7f3f2d836a66644 and timestamp: 1648484182.5431263:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 125
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_17c5e214-3015-45a5-9087-fa0860095e93_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_08_50_44-6445254387771334530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-28_09_02_57-2745958637978817641?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/kyg2nzlxwgygy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #656

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/656/display/redirect>

Changes:


------------------------------------------
[...truncated 55.75 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695994 sha256=5a6a666952937f3d8e7a3cf289d52c3f9532319f6fd4fbcedf2d9305c100ae81
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.27 botocore-1.24.27 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396241.661936/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
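
The warning above means --pubsub_namespace_prefix is not a flag that PipelineOptions knows about, so the parser drops it (the load test presumably consumes it elsewhere). A minimal sketch of how such a flag could be registered as a custom option; the class name and help text here are hypothetical:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used to namespace Pub/Sub test resources.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix)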
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220327155041662864-2441'
 createTime: '2022-03-27T15:50:48.179114Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-27_08_50_47-4711899144640281827'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0327150527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-27T15:50:48.179114Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-27_08_50_47-4711899144640281827]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-27_08_50_47-4711899144640281827
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_08_50_47-4711899144640281827?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-27_08_50_47-4711899144640281827 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:53.738Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.660Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.697Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.785Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.881Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.921Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.957Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:54.993Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.054Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.277Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.337Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.497Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.542Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.580Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
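
The fusing messages above trace the shape of the write pipeline: a bounded synthetic source ("Create input"), formatting into Pub/Sub message bytes, a timing step, and the Pub/Sub write. A rough reconstruction with the step names taken from the log; the bodies are stand-ins, not the actual pubsub_io_perf_test.py code:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    topic = 'projects/my-project/topics/my-topic'  # placeholder

    p = beam.Pipeline(options=PipelineOptions())  # Dataflow flags omitted
    _ = (p
         | 'Create input' >> beam.Create(range(1000))  # stands in for the synthetic source
         | 'Format to pubsub message in bytes' >> beam.Map(lambda i: str(i).encode('utf-8'))
         | 'Measure time' >> beam.Map(lambda b: b)  # the real step records runtime metrics
         | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic))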
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.745Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.832Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.874Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.909Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:55.942Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:56.025Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:56.065Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:50:56.115Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:30.322Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:30.362Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:32.660Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:51:40.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T15:52:05.790Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:15.965Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.124Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.171Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.259Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:00:16.313Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-27_08_50_47-4711899144640281827 after 602 seconds
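
This "Timing out" warning is what the Dataflow runner logs when wait_until_finish() is given a bounded duration and the (streaming) job is still running when it expires. A sketch of the pattern, with placeholder Dataflow flags; the ten-minute figure mirrors the log and is not taken from the test code:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder flags; DirectRunner does not support the duration argument.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',
        '--streaming',
    ])

    p = beam.Pipeline(options=options)
    p | beam.Create([1, 2, 3])  # stand-in for the real workload

    result = p.run()
    # Returns when the job reaches a terminal state or after roughly ten
    # minutes, whichever comes first; a healthy streaming job normally hits
    # the timeout, producing a warning like the one above.
    result.wait_until_finish(duration=10 * 60 * 1000)  # duration is in ms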
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fa6d761871f742dc9e713ab85868701b and timestamp: 1648396974.9372354:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 119
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0327150527.1648396981.550581/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220327160301551488-3847'
 createTime: '2022-03-27T16:03:09.204265Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-27_09_03_08-18240558474435659941'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0327150527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-27T16:03:09.204265Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-27_09_03_08-18240558474435659941]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-27_09_03_08-18240558474435659941
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_09_03_08-18240558474435659941?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-27_09_03_08-18240558474435659941 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:14.896Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.114Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.149Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.214Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.261Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.290Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.358Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.401Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.430Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.494Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.528Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.559Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.639Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.665Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.697Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.720Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.752Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.794Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.827Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.872Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
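
For the read job, the fusing messages trace the complementary shape: a Pub/Sub read, a decode step, timing, windowing, a global count (KeyWithVoid/CombinePerKey/UnKey is exactly how a global combine expands), conversion back to bytes, and a Pub/Sub write of the result. A rough reconstruction under the same caveats as the write-side sketch above:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import window

    subscription = 'projects/my-project/subscriptions/my-sub'  # placeholder
    topic = 'projects/my-project/topics/my-topic'              # placeholder

    p = beam.Pipeline(options=PipelineOptions())  # streaming flags omitted
    _ = (p
         | 'Read from pubsub' >> beam.io.ReadFromPubSub(subscription=subscription)
         | 'Map' >> beam.Map(lambda msg: msg)  # the lambda at pubsub_io_perf_test.py:171 sits here
         | 'Measure time' >> beam.Map(lambda x: x)  # metrics hook in the real test
         | 'Window' >> beam.WindowInto(window.FixedWindows(60))
         | 'Count messages' >> beam.CombineGlobally(
               beam.combiners.CountCombineFn()).without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
         | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic))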
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.896Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.942Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:16.973Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.005Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.038Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.091Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.122Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:17.163Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:46.555Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:03:56.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-27T16:04:20.791Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-27_09_03_08-18240558474435659941 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 080604e9c65f49e1b06e830927112dae and timestamp: 1648397774.3947783:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 92
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_08_50_47-4711899144640281827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-27_09_03_08-18240558474435659941?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 154, in run
    self.cleanup()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_65637f39-760d-49d9-94e8-bf85955f8f25_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 52s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5jmplau6i4eve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #655

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/655/display/redirect?page=changes>

Changes:

[ryanthompson591] iterable_input_value_types will now be an iterable, I don't anticipate

[marco.robles] [BEAM-8218] PulsarIO Connector

[benjamin.gonzalez] [BEAM-12572] Change examples jobs to run as cron jobs

[benjamin.gonzalez] [BEAM-12572] SpotlessApply

[Robert Bradshaw] [BEAM-14171] More explicit asserts in CoGBKResult.

[Robert Bradshaw] Add some comments.

[noreply] [BEAM-14160] Parse filesToStage in Java expansion service (#17167)

[chamikaramj] Mapped JOB_STATE_RESOURCE_CLEANING_UP to RESOURCE_CLEANING_UP in Python

[noreply] Explicitly import estimator from tensorflow (#17168)


------------------------------------------
[...truncated 55.62 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695994 sha256=095ff4f884c2ad0d5f1869f9147e985924402f8d9517b85756d3f9936cfe7fc9
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.27 botocore-1.24.27 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648309842.291760/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220326155042292667-2331'
 createTime: '2022-03-26T15:50:49.875405Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-26_08_50_49-8532487721309707138'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0326150513'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-26T15:50:49.875405Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-26_08_50_49-8532487721309707138]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-26_08_50_49-8532487721309707138
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_08_50_49-8532487721309707138?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-26_08_50_49-8532487721309707138 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:55.069Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.077Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.106Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.173Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.226Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.255Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.277Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.301Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.340Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.392Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.425Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.458Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.492Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.526Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.575Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.654Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.684Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.713Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.746Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.786Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.834Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.866Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:50:56.904Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:51:25.368Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:51:39.529Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T15:52:05.552Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.727Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.791Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.832Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.882Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:00:23.919Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-26_08_50_49-8532487721309707138 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6464f000845e4070acca5329e1fd7143 and timestamp: 1648310577.4501204:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 108
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0326150513.1648310581.289617/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220326160301290512-5660'
 createTime: '2022-03-26T16:03:08.513074Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-26_09_03_08-10414373202299082515'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0326150513'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-26T16:03:08.513074Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-26_09_03_08-10414373202299082515]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-26_09_03_08-10414373202299082515
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_09_03_08-10414373202299082515?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-26_09_03_08-10414373202299082515 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:13.739Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.453Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.492Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.568Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.637Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.672Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.779Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.864Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.918Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.951Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:14.982Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.123Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.155Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.186Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.217Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.251Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.314Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.403Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.449Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.478Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.512Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.545Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.615Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.651Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:15.680Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:32.455Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:03:56.785Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-26T16:04:21.619Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-26_09_03_08-10414373202299082515 after 601 seconds
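
The timeout above is the harness's bounded wait expiring, not a pipeline error: a streaming job never finishes on its own, so the test waits a fixed duration and then moves on to collecting metrics and cleanup. A minimal sketch of that pattern (the pipeline handle and duration are illustrative, not the test's actual code):

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # pipeline: an apache_beam.Pipeline (hypothetical handle)
    result.wait_until_finish(duration=10 * 60 * 1000)  # duration is in milliseconds
    if result.state != PipelineState.DONE:
        result.cancel()  # stop the streaming job explicitly
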
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 9d68a4266ec94d059beda3ee85f4f608 and timestamp: 1648311393.4949863:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 99
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 9d68a4266ec94d059beda3ee85f4f608 and timestamp: 1648311393.4949863:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 99
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_08_50_49-8532487721309707138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-26_09_03_08-10414373202299082515?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_403a40d6-3d73-4c90-b464-ed934d7d2bb3_read'
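
The TypeError comes from cleanup, not the pipeline itself: google-cloud-pubsub 2.x generated clients no longer accept the resource path as a positional argument, so the string is passed through as the request mapping and proto-plus rejects it. A minimal sketch of the failing call and the 2.x-style calls that work (the subscription path here is shortened and illustrative):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = 'projects/apache-beam-testing/subscriptions/example_read'  # illustrative path

    # Fails on 2.x: the positional string becomes the `request` mapping.
    # sub_client.delete_subscription(read_sub_name)

    # Works on 2.x: pass the path as a keyword argument...
    sub_client.delete_subscription(subscription=read_sub_name)
    # ...or as an explicit request mapping:
    # sub_client.delete_subscription(request={'subscription': read_sub_name})
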

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 11s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bgzbatlsr7t76

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #654

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/654/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-14139] Drop support for Flink 1.11.

[Kyle Weaver] [BEAM-14139] Remove obsolete reference to Flink 1.11.

[Kyle Weaver] [BEAM-14139] Update list of supported Flink versions.

[Kyle Weaver] [BEAM-14139] Update CHANGES.md

[noreply] [BEAM-14157] Don't call requestObserver.onNext on a closed windmill

[noreply] Minor: Make IOTypeHints a real NamedTuple (#17174)

[noreply] [BEAM-14172] Update tox.ini for pydocs (#17176)

[noreply] [BEAM-14065] Upgrade vendored bytebuddy to version 1.12.8 (#17028)


------------------------------------------
[...truncated 55.61 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Collecting grpcio-status>=1.16.0
  Using cached grpcio_status-1.44.0-py3-none-any.whl (10.0 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695940 sha256=ee580f7c9136b480b72044fc5692c7a31faf2f852a52a5d1a3efcb5524b140dc
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.26 botocore-1.24.26 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648223442.385099/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
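
Beam only parses flags that some registered PipelineOptions subclass declares; anything else is discarded with the warning above (which appears benign here, since the prefix is consumed by the test harness rather than the pipeline). A minimal sketch of declaring such a flag so it parses cleanly (the class name is hypothetical):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical options subclass
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used to namespace Pub/Sub topics and subscriptions.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    assert opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix == 'pubsub_io_performance_'
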
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220325155042386065-4565'
 createTime: '2022-03-25T15:50:51.112843Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-25_08_50_49-14711236213746615654'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0325150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-25T15:50:51.112843Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-25_08_50_49-14711236213746615654]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-25_08_50_49-14711236213746615654
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_08_50_49-14711236213746615654?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-25_08_50_49-14711236213746615654 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:58.135Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:58.908Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:58.951Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.054Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.119Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.153Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.189Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.221Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.261Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.436Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.494Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.576Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.683Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.769Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.811Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.846Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.893Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:50:59.951Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:00.014Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:00.035Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:02.093Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:13.520Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:51:42.249Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T15:52:07.608Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:16.909Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.222Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.278Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.330Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:00:17.373Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-25_08_50_49-14711236213746615654 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: ce215b7f561645c6be63ed72ab4f525a and timestamp: 1648224162.8593292:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 117
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0325150536.1648224167.234807/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220325160247235831-9950'
 createTime: '2022-03-25T16:02:54.267613Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-25_09_02_53-5305199909241581353'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0325150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-25T16:02:54.267613Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-25_09_02_53-5305199909241581353]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-25_09_02_53-5305199909241581353
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_09_02_53-5305199909241581353?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-25_09_02_53-5305199909241581353 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:04.048Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.113Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.140Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.229Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.292Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.320Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.395Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.450Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.491Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.515Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.536Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.570Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.636Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.719Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.743Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.782Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.813Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.847Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.879Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.906Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.936Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.967Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:07.990Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.011Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.056Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.094Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:08.116Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:34.138Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:37.130Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:37.263Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:03:47.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-25T16:04:10.048Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-25_09_02_53-5305199909241581353 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 37b1e39389aa48138b07c3dfe5914a6e and timestamp: 1648224953.9698627:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 101
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 37b1e39389aa48138b07c3dfe5914a6e and timestamp: 1648224953.9698627:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 101
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_08_50_49-14711236213746615654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-25_09_02_53-5305199909241581353?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_fdeede52-ff6f-4b0c-9e9c-9fcddfbd3150_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 33s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ym5nv7kpz3q2o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #653

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/653/display/redirect?page=changes>

Changes:

[bulat.safiullin] [BEAM-13976] [Website] update homepage

[bulat.safiullin] [BEAM-13976] [Website] update homepage, add logo

[bulat.safiullin] [BEAM-13976] [Website] update text

[bulat.safiullin] [BEAM-13976] [Website] Update Community landing page

[bulat.safiullin] [BEAM-13979] [Website] Update Community/Contact us page

[bulat.safiullin] [BEAM-13979] [Website] update title

[bulat.safiullin] [BEAM-13979] [Website] delete space

[bulat.safiullin] [BEAM-13979] [Website] add Beam Playground

[bulat.safiullin] [BEAM-13976] [Website] delete Beam Playground

[bulat.safiullin] [BEAM-13976] [Website] change navbar css links rules, delete links from

[bulat.safiullin] [BEAM-13977] [Website] delete available-contact-channels on mobile

[bulat.safiullin] [BEAM-13976] [Website] change padding size between the sections

[bulat.safiullin] [BEAM-13976] [Website] change title to capital letters

[bulat.safiullin] [BEAM-13976] [Website] change title

[bulat.safiullin] [BEAM-14040] [Website] create new page, add link

[bulat.safiullin] [BEAM-13977] [Website] change title

[bulat.safiullin] [BEAM-13979] [Website] change text

[bulat.safiullin] [BEAM-13976] [Website] change text

[bulat.safiullin] [BEAM-13977] [Website] change text, add capital letters

[bulat.safiullin] [BEAM-13976] [Website] add playground sass, change text-align

[bulat.safiullin] [BEAM-14040] [Website] add io connectors table

[bulat.safiullin] [BEAM-13976] [Website] add playground section, add empty line

[bulat.safiullin] [BEAM-14040] [Website] add overflow to css, add table content

[bulat.safiullin] [BEAM-14040] [Website] change ✘ for ✔, add license, add br

[bulat.safiullin] [BEAM-14040] [Website] add empty line

[bulat.safiullin] [BEAM-14040] [Website] change td

[bulat.safiullin] [BEAM-14041] [Website] update built io transforms

[bulat.safiullin] [BEAM-14041] [Website] move connectors from Miscellaneous to Database

[bulat.safiullin] [BEAM-14040] [Website] change links color

[danielamartinmtz] Updated metrics' CronJob API to use the latest batch version.

[bulat.safiullin] [BEAM-14041] [Website] change IO from go to java

[bulat.safiullin] [BEAM-14040] [Website] change links, change specific version to current

[danielamartinmtz] Updated cluster to test in metrics-upgrade-clone in BeamMetrics_Publish

[aydar.zaynutdinov] [BEAM-13976][Website]

[aydar.zaynutdinov] [BEAM-14040][Website]

[aydar.zaynutdinov] [BEAM-14041][Website]

[danielamartinmtz] Updated StateFulSet k8s obejct in cassandra-svc-statefulset.yaml file in

[danielamartinmtz] Updated documentation including cluster specs.

[noreply] Beam 13058 k8s apis upgrade - elasticsearch (#18)

[danielamartinmtz] Removed code used for testing.

[danielamartinmtz] Removed code used for testing in job_PostCommit_BeamMetrics_Publish

[noreply] Beam 13058 k8s apis upgrade - Adding Basic Auth details in documentation

[Pablo Estrada] [BEAM-14151] Excluding Spanner CDC tests from Dataflow V1 suite

[danielamartinmtz] Added comments in initContainers and remove unused code in elasticsearch

[noreply] [BEAM-12697] Add primitive field generation from IR to SBE extension

[noreply] [BEAM-13889] Add test cases to jsonx package (#17124)

[noreply] Remove unreachable code in container.go (#17166)

[noreply] Add ability to handle streaming input to AvroSchemaIOProvider (#17126)

[noreply] [BEAM-12898] Flink Load Tests failure- UncheckedExecutionException -

[Daniel Oliveira] Moving to 2.39.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 55.48 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.39.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.39.0.dev0-py3-none-any.whl size=2695955 sha256=7b835f32ce796b07e7c1f7dacb678bfad0c4315420300002f7937db0ce42335a
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.39.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.25 botocore-1.24.25 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.45.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137076.483940/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220324155116486158-3179'
 createTime: '2022-03-24T15:51:23.539504Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-24_08_51_23-2019057477283921742'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0324150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-24T15:51:23.539504Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-24_08_51_23-2019057477283921742]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-24_08_51_23-2019057477283921742
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_08_51_23-2019057477283921742?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-24_08_51_23-2019057477283921742 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:42.259Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.584Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.617Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.689Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.732Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.761Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.786Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.842Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.892Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.948Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:43.983Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.007Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.036Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.096Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.237Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.279Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.314Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.347Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.383Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.462Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.496Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:51:44.561Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:14.415Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:18.007Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:18.034Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:28.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T15:52:48.461Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.160Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.237Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.278Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.323Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:01:02.348Z: JOB_MESSAGE_BASIC: Stopping **** pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-24_08_51_23-2019057477283921742 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7b6876e3ede44d4d927ec766a2df853b and timestamp: 1648137820.6557474:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 107
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0324150534.1648137825.572785/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
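The "Discarding unparseable args" warnings above are benign: --pubsub_namespace_prefix is consumed by the test harness and is not a registered pipeline option, so PipelineOptions drops it. Declaring the flag would let the parser accept it; a sketch with a hypothetical options class name:

    # Sketch: register a custom flag so PipelineOptions can parse it.
    # PubsubPerfOptions is a hypothetical name for illustration.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                "--pubsub_namespace_prefix",
                default=None,
                help="Namespace prefix for Pub/Sub performance-test resources.")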
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220324160345573769-1024'
 createTime: '2022-03-24T16:03:51.844999Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-24_09_03_51-3352884538278352834'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0324150534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-24T16:03:51.844999Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-24_09_03_51-3352884538278352834]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-24_09_03_51-3352884538278352834
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_09_03_51-3352884538278352834?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-24_09_03_51-3352884538278352834 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:58.338Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.469Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.503Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.562Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.635Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.675Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.732Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.811Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.843Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.867Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:03:59.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.082Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.129Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.163Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.216Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.298Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.329Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.362Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.402Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
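"Running job using Streaming Engine" reflects how the job was configured rather than a runtime decision. A sketch of options that yield such a job, assuming Beam's standard Dataflow flags; the values echo this log but are illustrative:

    # Sketch: pipeline options for a Streaming Engine job on Dataflow.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--project=apache-beam-testing",
        "--region=us-central1",
        "--streaming",
        "--enable_streaming_engine",
        "--temp_location=gs://temp-storage-for-perf-tests/loadtests",
    ])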
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.437Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.463Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.522Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.556Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.608Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.641Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:00.696Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:12.425Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:04:40.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-24T16:05:06.417Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-24_09_03_51-3352884538278352834 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 759ba600232f4621a7c8b6780d246289 and timestamp: 1648138627.1467967:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_93974781-8d68-4831-ab20-0388ab0d181c_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_08_51_23-2019057477283921742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-24_09_03_51-3352884538278352834?project=apache-beam-testing
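The TypeError above is the actual failure: the cleanup code passes the subscription path positionally, but in google-cloud-pubsub 2.x the first positional parameter of delete_subscription is a request object, so a bare string is rejected by the DeleteSubscriptionRequest constructor. A probable fix, assuming the v2 client shown in the traceback, is to pass the path as a keyword argument:

    # Sketch of the fix, assuming google-cloud-pubsub >= 2.0.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = ("projects/apache-beam-testing/subscriptions/"
                     "pubsub_io_performance_93974781-8d68-4831-ab20-0388ab0d181c_read")

    # Keyword form: the client builds the request object itself.
    sub_client.delete_subscription(subscription=read_sub_name)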

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 44s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tbxjjxwwrioy2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #652

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/652/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13232] Close clients properly in KinesisSource. Also use lazy init

[noreply] [BEAM-14141] Set Interactive Beam to use the default Dataproc image

[noreply] BEAM-14115 - Update find criteria limited to _id (#17102)

[chamikaramj] Disable BigQueryIOStorageWriteIT for general Java post-commit

[noreply] Revert "[BEAM-14038] Auto-startup for Python expansion service.

[noreply] Minor: Bump timeout for Java PreCommit (#17157)

[noreply] [BEAM-14152] Disable flaky

[noreply] Fixing a small bug in TypedSchemaTransformTest that caused it to flake.

[noreply] [BEAM-14116] Catch MonitoringInfoMetricName null keys or values in the

[noreply] [BEAM-14129] Restructure SubscriptionPartitionLoader to use a manual SDF

[noreply] [BEAM-13015] Avoid repeated weighing of StateKey in

[noreply] Add option to add modules to JDK add-open (#17110)

[noreply] [BEAM-13015] Clarify ownership of the list for state caching across

[noreply] [BEAM-14134] Optimize memory allocations for various core coders

[noreply] [BEAM-14129] Restructure PubsubLiteIO Read side to produce smaller


------------------------------------------
[...truncated 57.07 KB...]
> Task :sdks:java:harness:classes
> Task :sdks:java:harness:jar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :sdks:java:expansion-service:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648051329.503545/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220323160209504433-4100'
 createTime: '2022-03-23T16:02:18.177547Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-23_09_02_15-1936031335567360432'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0323150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-23T16:02:18.177547Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-23_09_02_15-1936031335567360432]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-23_09_02_15-1936031335567360432
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_02_15-1936031335567360432?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-23_09_02_15-1936031335567360432 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:24.405Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.420Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.447Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.512Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.574Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.600Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.633Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.665Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.703Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.730Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.761Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.795Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.828Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:25.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.036Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.065Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.094Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.115Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.147Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.196Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.232Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:26.251Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:02:40.646Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:03:09.882Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:03:34.293Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.568Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.634Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.667Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.704Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:11:41.732Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-23_09_02_15-1936031335567360432 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 568be7cce4b34445a3b0f42d17f479b4 and timestamp: 1648052050.1394997:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 110
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0323150537.1648052055.140426/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220323161415141346-2196'
 createTime: '2022-03-23T16:14:21.354513Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-23_09_14_20-7322242075389462471'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0323150537'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-23T16:14:21.354513Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-23_09_14_20-7322242075389462471]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-23_09_14_20-7322242075389462471
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_14_20-7322242075389462471?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-23_09_14_20-7322242075389462471 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:27.028Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:33.797Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:33.912Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:33.995Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.088Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.174Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.269Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.370Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.424Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.527Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.666Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.691Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.742Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.779Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.818Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.879Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:34.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.100Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.196Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.235Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.288Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.337Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.411Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.447Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:35.500Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:14:54.448Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:06.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:06.652Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:16.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-23T16:15:40.507Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-23_09_14_20-7322242075389462471 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c4256ba0071d4ff79bcee51ff1c9a8c1 and timestamp: 1648052856.3371046:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 108
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_4b8cd688-8840-49e4-8c80-d4e4c3b20978_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_02_15-1936031335567360432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-23_09_14_20-7322242075389462471?project=apache-beam-testing
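This is the same TypeError as in the preceding report. Besides the keyword-argument form, the v2 API also accepts an explicit request mapping, which satisfies the constructor that raises here; a sketch under the same client-version assumption:

    # Sketch: the dict form is converted into a DeleteSubscriptionRequest.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = ("projects/apache-beam-testing/subscriptions/"
                "pubsub_io_performance_4b8cd688-8840-49e4-8c80-d4e4c3b20978_read")

    sub_client.delete_subscription(request={"subscription": sub_name})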

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 14s
92 actionable tasks: 73 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ikszch3dtyrne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #651

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/651/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-14124] Add display data to BQ storage reads.

[noreply] fixes static checks and go lint issues (#17138)

[Kyle Weaver] Don't print in task configuration.

[noreply] [BEAM-14136] Clean up staticcheck and linter warnings in the Go SDK

[noreply] Merge pull request #17063 from [BEAM-12164] Fix flaky tests

[noreply] Revert "[BEAM-14112] Avoid storing a generator in _CustomBigQuerySource

[Kyle Weaver] [BEAM-4106] Remove filesToStage from Flink pipeline option list.

[noreply] [BEAM-14071] Enabling Flink on Dataproc for Interactive Beam (#17044)

[noreply] Minor: Bypass schema registry in schemas_test.py (#17108)


------------------------------------------
[...truncated 55.62 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2695811 sha256=6f7de31194ea5092dc7c9abd6b12437253560c9e4b8083318981db90ba867cf4
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.23 botocore-1.24.23 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964243.817059/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220322155043818043-8112'
 createTime: '2022-03-22T15:50:50.676888Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-22_08_50_49-11404046316872429164'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0322150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-22T15:50:50.676888Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-22_08_50_49-11404046316872429164]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-22_08_50_49-11404046316872429164
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_08_50_49-11404046316872429164?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-22_08_50_49-11404046316872429164 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:57.691Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.400Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.425Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.474Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.501Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.529Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.555Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.580Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.633Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.658Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.679Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.705Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.731Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.755Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.780Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.916Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.940Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:58.962Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.006Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.035Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.081Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.108Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:50:59.134Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:23.648Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:34.671Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:34.703Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:51:45.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T15:52:07.942Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.389Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.435Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.461Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.496Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:00:13.526Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-22_08_50_49-11404046316872429164 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 96afde8eaafb4e6a893245141458d337 and timestamp: 1647964963.6769545:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 101
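
The "Timing out on waiting for job ... after 600 seconds" warning above is the expected shape of these streaming load tests, which block on the job for a bounded time before collecting metrics. A sketch of that pattern, assuming a DataflowRunner result object named result (not the harness's actual code):

    import logging
    from apache_beam.runners.runner import PipelineState

    # wait_until_finish takes a duration in milliseconds; on timeout it
    # logs the warning seen above and returns the job's current state,
    # which may still be RUNNING for a streaming job.
    state = result.wait_until_finish(duration=600 * 1000)
    if state not in (PipelineState.DONE, PipelineState.CANCELLED):
        logging.warning('Job %s not terminal after timeout', result.job_id())
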
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0322150536.1647964968.693656/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
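
The repeated "Discarding unparseable args" warnings are benign: --pubsub_namespace_prefix is consumed by the test harness rather than by any registered PipelineOptions subclass, so Beam's options parser flags it. A sketch of how such a flag could be declared so it parses cleanly (the class name is hypothetical; _add_argparse_args is the standard Beam hook):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):  # hypothetical options class
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix', default=None,
                help='Prefix for Pub/Sub resources created by the test.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = opts.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix
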
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220322160248694593-6315'
 createTime: '2022-03-22T16:02:56.224659Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-22_09_02_55-11823124561686518281'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0322150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-22T16:02:56.224659Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-22_09_02_55-11823124561686518281]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-22_09_02_55-11823124561686518281
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_09_02_55-11823124561686518281?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-22_09_02_55-11823124561686518281 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:01.569Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.276Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.308Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.375Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.445Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.478Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.595Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.656Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.689Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.714Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.746Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.780Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.802Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.920Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:03.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.358Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.458Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.496Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.579Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.613Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.642Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.665Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.698Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.752Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.776Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:04.821Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:17.847Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:03:46.049Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-22T16:04:11.406Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-22_09_02_55-11823124561686518281 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4b79ebd830c74cd7a2c8a4261156770c and timestamp: 1647965761.5007348:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 121
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4b79ebd830c74cd7a2c8a4261156770c and timestamp: 1647965761.5007348:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 121
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_08_50_49-11404046316872429164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-22_09_02_55-11823124561686518281?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_10dd009b-e529-4e81-b1da-c823909177c8_read'
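
The TypeError above is consistent with the keyword-only request API introduced in google-cloud-pubsub 2.0 (these builds install 2.11.0): SubscriberClient.delete_subscription no longer accepts the subscription path as a positional argument, so the bare string is treated as a request mapping and rejected by the proto constructor. A minimal sketch of the likely fix, assuming self.read_sub_name holds the full path shown in the error:

    # google-cloud-pubsub >= 2.0: pass the path as a keyword argument ...
    self.sub_client.delete_subscription(subscription=self.read_sub_name)
    # ... or wrap it in an explicit request mapping:
    self.sub_client.delete_subscription(
        request={'subscription': self.read_sub_name})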

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3o7i7bqkjygye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #650

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/650/display/redirect?page=changes>

Changes:

[mmack] [adhoc] Move aws IT tests to testing package according to best practices


------------------------------------------
[...truncated 55.53 KB...]
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=8b19de0dc4c5f35b39d3bb42949752759993f19af92fb6831c350f5bfa7f3caf
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.22 botocore-1.24.22 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647877834.333198/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220321155034334118-9216'
 createTime: '2022-03-21T15:50:40.293803Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-21_08_50_39-7986734118607749467'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0321150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-21T15:50:40.293803Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-21_08_50_39-7986734118607749467]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-21_08_50_39-7986734118607749467
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_08_50_39-7986734118607749467?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-21_08_50_39-7986734118607749467 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.170Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.851Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.883Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:55.962Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.036Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.064Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.088Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.114Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.150Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.180Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.204Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.226Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.282Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.338Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.370Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.466Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.516Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.560Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.598Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.631Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.691Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:56.718Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:50:57.008Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:51:16.953Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:51:47.604Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T15:52:12.523Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.233Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.273Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.301Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.334Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:00:16.357Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-21_08_50_39-7986734118607749467 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a70f98c3857944a3855499dfe4e33d4d and timestamp: 1647878567.1928797:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 107
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0321150516.1647878573.202556/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220321160253203446-7456'
 createTime: '2022-03-21T16:03:01.180210Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-21_09_03_00-11981826017667924636'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0321150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-21T16:03:01.180210Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-21_09_03_00-11981826017667924636]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-21_09_03_00-11981826017667924636
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_09_03_00-11981826017667924636?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-21_09_03_00-11981826017667924636 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:20.128Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.181Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.214Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.295Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.396Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.430Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.518Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.589Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.633Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.671Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.703Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.737Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.797Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.870Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.903Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.942Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:21.970Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.005Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.082Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.131Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.178Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.241Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.300Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.338Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.413Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.456Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:22.494Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:03:50.473Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:04:02.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-21T16:04:27.144Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-21_09_03_00-11981826017667924636 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 2e64bd4172a94a6fa41e0be407444f18 and timestamp: 1647879382.1889186:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 2e64bd4172a94a6fa41e0be407444f18 and timestamp: 1647879382.1889186:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_08_50_39-7986734118607749467?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-21_09_03_00-11981826017667924636?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_2a3364bc-3b3d-41ed-a67f-71efb77b53c1_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 2s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7ooeytqd3flf4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #649

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/649/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14122] Upgrade pip-licenses dependency (#17132)


------------------------------------------
[...truncated 55.43 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=b98999392a7be93242af5901805a2c138d4377844bdbf8c8dc143e9a41e78993
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.22 botocore-1.24.22 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647791442.032460/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220320155042033408-8178'
 createTime: '2022-03-20T15:50:48.912227Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-20_08_50_48-15406086144577952723'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0320150521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-20T15:50:48.912227Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-20_08_50_48-15406086144577952723]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-20_08_50_48-15406086144577952723
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_08_50_48-15406086144577952723?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-20_08_50_48-15406086144577952723 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:54.603Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.494Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.525Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.578Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.618Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.652Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.683Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.759Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.800Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.882Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.915Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.963Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:55.988Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.006Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.099Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.120Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.152Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.182Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.217Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.267Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.291Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:50:56.319Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:30.010Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
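
A note on the quota message above: it is informational, but stale custom metric descriptors can be pruned so that new per-job metrics are created again. A minimal sketch, assuming the google-cloud-monitoring client library (the project id is taken from the log; the prefix check is illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = 'projects/apache-beam-testing'
    # Only Dataflow-created custom.googleapis.com/* descriptors count
    # against the 100-descriptor limit mentioned in the message above.
    for descriptor in client.list_metric_descriptors(name=project):
        if descriptor.type.startswith('custom.googleapis.com/'):
            client.delete_metric_descriptor(name=descriptor.name)
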
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:30.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:30.993Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:51:41.333Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:52:03.591Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T15:52:03.627Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.788Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.863Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.891Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.930Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:00:08.992Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-20_08_50_48-15406086144577952723 after 601 seconds
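
The timeout above is expected for this streaming load test: the harness bounds how long it waits rather than blocking until a terminal state. A minimal sketch of that pattern, assuming Beam's standard PipelineResult API (the ten-minute bound mirrors the ~601 s in the log; pipeline construction is elided):

    import apache_beam as beam

    pipeline = beam.Pipeline()  # transforms elided in this sketch
    result = pipeline.run()
    # duration is in milliseconds; for a streaming job this can return
    # before a terminal state, which produces the warning above. The test
    # then collects metrics and cleans up regardless.
    result.wait_until_finish(duration=10 * 60 * 1000)
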
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f4c8d84dad2349e0a4c77f6dd09e9550 and timestamp: 1647792171.0019605:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 111
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0320150521.1647792175.511248/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
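
The "Discarding unparseable args" warnings are benign: the flag is consumed by the test harness rather than by the pipeline, so PipelineOptions does not recognize it. A minimal sketch of how such a flag could instead be registered so it parses cleanly (the class name and help text are hypothetical):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            # A registered flag no longer shows up as "unparseable args".
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for Pub/Sub resources created by the test.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix
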
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220320160255512247-8623'
 createTime: '2022-03-20T16:03:01.914625Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-20_09_03_01-14277975848265007722'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0320150521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-20T16:03:01.914625Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-20_09_03_01-14277975848265007722]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-20_09_03_01-14277975848265007722
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_09_03_01-14277975848265007722?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-20_09_03_01-14277975848265007722 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:06.727Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.431Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.464Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.529Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.603Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.628Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.695Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.759Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.797Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.825Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.846Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.899Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.953Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:07.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.009Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.071Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.208Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.258Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.283Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.329Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.379Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.525Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.585Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:08.745Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:17.588Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:03:54.905Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:04:19.177Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-20T16:04:19.213Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-20_09_03_01-14277975848265007722 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 411244e296e848b2b8aa6378c1c82ee7 and timestamp: 1647792961.58281:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 411244e296e848b2b8aa6378c1c82ee7 and timestamp: 1647792961.58281:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 83
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_08_50_48-15406086144577952723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-20_09_03_01-14277975848265007722?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_2e599797-d216-4c1a-aa8e-6e8298db4e80_read'
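
This TypeError is the actual failure. With the google-cloud-pubsub 2.x client installed by this suite, delete_subscription no longer accepts the subscription path as a bare positional argument: the string binds to the request parameter and fails to construct a DeleteSubscriptionRequest, exactly as the traceback shows. A minimal sketch of the keyword form the 2.x API expects (the subscription name is shortened for illustration):

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    sub_path = 'projects/apache-beam-testing/subscriptions/example_read'

    # 2.x clients take the resource name as a keyword argument (or a
    # request object), not as the first positional argument.
    subscriber.delete_subscription(subscription=sub_path)
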

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nvf2xkvdthwjq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #648

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/648/display/redirect?page=changes>

Changes:

[Kiley Sok] Add Java 17 Nexmark metrics to Grafana

[yiru] .

[yiru] .

[yiru] .

[yiru] format fix

[yiru] .

[yiru] make DoFn into a separate class

[yiru] .

[yiru] fix setting

[mmack] [BEAM-14125] Update website IO matrix to recommend aws2 IOs

[noreply] [BEAM-14128] Eliminating quadratic behavior of

[noreply] [BEAM-13972] Add RunInference interface (#16917)

[noreply] Merge pull request #17116 from [BEAM-12164] Remove change_stream in

[yiru] fix checkstyle

[yiru] spotlessapply

[noreply] Deprecate tags.go (#17025)

[noreply] [BEAM-12753] and [BEAM-12815] Fix Flink Integration Tests (#17067)

[noreply] Merge pull request #16895 from [BEAM-13882][Playground] More tests for

[noreply] [BEAM-13925] Add weekly automation to update our reviewer config

[noreply] Merge pull request #17076 from Beam 14082 update playground for mobile

[noreply] [BEAM-13925] Assign committers in the scheduled action (#17062)

[noreply] Pin setup-gcloud to v0 instead of master (#17123)

[noreply] [BEAM-3304] documentation for PaneInfo in BPG (#17047)

[noreply] Merge pull request #17016 from [BEAM-14049][Playground] Add new API

[noreply] Merge pull request #17077 from [BEAM-14078] [Website] change link

[noreply] Merge pull request #17085 from [BEAM-14077] [Website] add beam

[noreply] Update Changes.md w/Go pipeline pre-process fix.

[noreply] [BEAM-14098] wrapper for postgres on JDBC IO GO SDK (#17088)

[noreply] Merge pull request #17023 from [BEAM-12164]: Remove child partition


------------------------------------------
[...truncated 55.34 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=4e7bf0b4df5bdbf9be3c5207ecb53bbad5594cacb11adc8aa4941eeac4c9052d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.22 botocore-1.24.22 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705035.298282/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220319155035299183-9062'
 createTime: '2022-03-19T15:50:41.232485Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-19_08_50_40-4949934856950811884'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0319150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-19T15:50:41.232485Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-19_08_50_40-4949934856950811884]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-19_08_50_40-4949934856950811884
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_08_50_40-4949934856950811884?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-19_08_50_40-4949934856950811884 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:46.095Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.026Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.061Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.109Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.149Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.169Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.195Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.229Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.268Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.323Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.357Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.381Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.436Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.488Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.562Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.597Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.628Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.650Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.679Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.740Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.780Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:50:47.813Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:18.880Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:32.187Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:54.843Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T15:51:54.892Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.139Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.212Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.303Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.377Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:00:12.427Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-19_08_50_40-4949934856950811884 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 5ae25cae0d6c4120966f7b1b34daaddd and timestamp: 1647705768.089736:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 119
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0319150516.1647705772.893856/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220319160252894729-1246'
 createTime: '2022-03-19T16:02:59.405582Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-19_09_02_58-17112221556829961310'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0319150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-19T16:02:59.405582Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-19_09_02_58-17112221556829961310]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-19_09_02_58-17112221556829961310
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_09_02_58-17112221556829961310?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-19_09_02_58-17112221556829961310 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:05.654Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.683Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.704Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.761Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.831Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.860Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:06.924Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.025Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.069Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.109Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.138Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.234Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.375Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.407Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.438Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.473Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.505Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.577Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.605Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.636Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.681Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.712Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.771Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.801Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:07.877Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:21.769Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:03:50.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:04:15.346Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-19T16:04:15.368Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-19_09_02_58-17112221556829961310 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 93d4667d58454293aba1219b13b4984e and timestamp: 1647706584.5855942:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 89
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 93d4667d58454293aba1219b13b4984e and timestamp: 1647706584.5855942:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 89
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_08_50_40-4949934856950811884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-19_09_02_58-17112221556829961310?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_51b9254f-d449-4d9d-a72e-0aa27aee3b8c_read'
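
Same root cause as in the build above: the cleanup step passes the subscription path positionally to a google-cloud-pubsub 2.x client; see the sketch after the first traceback.
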

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 2s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zrugjq26r4u5y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/647/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10212] Clean-up comments, remove rawtypes usage.

[noreply] [BEAM-14112] Avoid storing a generator in _CustomBigQuerySource (#17100)

[noreply] Populate environment capabilities in v1beta3 protos. (#17042)

[Kyle Weaver] [BEAM-12976] Test a whole pipeline using projection pushdown in BQ IO.

[Kyle Weaver] [BEAM-12976] Enable projection pushdown for Java pipelines on Dataflow,

[noreply] [BEAM-14038] Auto-startup for Python expansion service. (#17035)

[Kyle Weaver] [BEAM-14123] Fix typo in hdfsIntegrationTest task name.

[noreply] [BEAM-13893] improved coverage of jobopts package (#17003)

[noreply] Merge pull request #16977 from [BEAM-12164]  Added integration test for

[mmack] [adhoc] Minor cleanup for aws2 tests


------------------------------------------
[...truncated 55.58 KB...]
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694808 sha256=ab5829fc1d38fefd1f6d6bc3cfe6f0a1a9c88d88f8d5755a916e444bf696e70f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.21 botocore-1.24.21 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.1 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.56.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-1.0.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647618645.399622/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
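The pair of warnings above only means that --pubsub_namespace_prefix is not a flag registered with any PipelineOptions subclass, so the generic option parser drops it. A minimal sketch of how a custom flag is normally registered so it parses cleanly, assuming only the Beam Python SDK; PubsubPerfOptions is a hypothetical class name, not part of this test:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical options class
        @classmethod
        def _add_argparse_args(cls, parser):
            # A registered flag is parsed instead of being reported as
            # "Discarding unparseable args".
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix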
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220318155045400535-5117'
 createTime: '2022-03-18T15:50:54.510892Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-18_08_50_51-1213532163946750817'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0318150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-18T15:50:54.510892Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-18_08_50_51-1213532163946750817]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-18_08_50_51-1213532163946750817
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_08_50_51-1213532163946750817?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-18_08_50_51-1213532163946750817 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:04.459Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.777Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.815Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.880Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:05.944Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.004Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.044Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.078Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.130Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.164Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.199Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.232Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.320Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.357Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.393Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.520Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.560Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.591Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.627Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.661Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.760Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.817Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:06.873Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:34.595Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
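The message above is a Cloud Monitoring quota condition rather than a Beam error: once the project holds 100 Dataflow-created metric descriptors, new custom.googleapis.com/* descriptors are skipped. A hedged cleanup sketch along the lines the message suggests, assuming google-cloud-monitoring is available (it is not installed by this build) and that removing old descriptors is actually safe for the project:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/apache-beam-testing",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting a descriptor also discards its historical time series,
        # so review the list before running a loop like this for real.
        client.delete_metric_descriptor(name=descriptor.name)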
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:51:46.600Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:52:11.215Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T15:52:11.247Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:21.816Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:21.955Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:21.999Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:22.032Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:00:22.097Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-18_08_50_51-1213532163946750817 after 600 seconds
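This timeout warning appears routinely in these streaming load tests: the harness bounds how long it waits for the job rather than waiting for a terminal state. A minimal sketch of the call that produces it, assuming the Beam Python SDK; the empty pipeline stands in for the real test pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    pipeline = beam.Pipeline(options=PipelineOptions())  # transforms omitted
    result = pipeline.run()
    # duration is in milliseconds; on Dataflow, expiry while the job is still
    # running logs "Timing out on waiting for job ... after 600 seconds".
    result.wait_until_finish(duration=600 * 1000)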
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 939eb58af10a487fb999cf943cb9b1fa and timestamp: 1647619381.7494645:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0318150529.1647619387.446200/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220318160307447095-3228'
 createTime: '2022-03-18T16:03:14.510021Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-18_09_03_13-1508967428110627461'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0318150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-18T16:03:14.510021Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-18_09_03_13-1508967428110627461]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-18_09_03_13-1508967428110627461
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_09_03_13-1508967428110627461?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-18_09_03_13-1508967428110627461 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:20.164Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.219Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.245Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.329Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.418Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.445Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.532Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.602Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.671Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.712Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.747Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.785Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:22.973Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.020Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.100Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.237Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.272Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.297Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.339Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.389Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.432Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.479Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.532Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:23.571Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:41.208Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:54.686Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:03:54.729Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:04:05.008Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:04:27.231Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-18T16:04:27.272Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-18_09_03_13-1508967428110627461 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6615e05ff94e4d018d63bd609f4a9215 and timestamp: 1647620237.4886332:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 302
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6615e05ff94e4d018d63bd609f4a9215 and timestamp: 1647620237.4886332:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 302
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_648d3d6e-488c-4e06-b60a-d7e132c815cd_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_08_50_51-1213532163946750817?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-18_09_03_13-1508967428110627461?project=apache-beam-testing
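The TypeError comes from the google-cloud-pubsub 2.x surface (google-cloud-pubsub-2.11.0 is installed earlier in this log): delete_subscription() no longer accepts a positional resource name, because its first positional parameter is now the request mapping. A minimal sketch of the failing call and a keyword form the 2.x client accepts; the subscription name below is a placeholder:

    # Sketch assuming google-cloud-pubsub>=2.x, as installed by this build.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = "projects/my-project/subscriptions/my-sub"  # hypothetical

    # Fails on 2.x: the positional string is taken as the `request` mapping,
    # and DeleteSubscriptionRequest cannot be built from a str (error above).
    # sub_client.delete_subscription(sub_name)

    # Accepted on 2.x: pass the resource name as the `subscription` keyword,
    # or construct an explicit DeleteSubscriptionRequest.
    sub_client.delete_subscription(subscription=sub_name)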

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4yvebz6ba443o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #646

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/646/display/redirect?page=changes>

Changes:

[noreply] Mapped JOB_STATE_RESOURCE_CLEANING_UP to State.RUNNING.

[ryanthompson591] fixed typo in typehints

[zyichi] Remove unused prebuild_sdk_container_base_image option from validate

[hengfeng] feat: add more custom metrics

[noreply] [BEAM-14103][Playground][Bugfix] Fix google analytics id (#17092)

[noreply] Minor: Make ScopedReadStateSupplier final (#16992)

[noreply] [BEAM-14113] Improve SamzaJobServerDriver extensibility (#17099)

[noreply] [BEAM-14116] Chunk commit requests dynamically (#17004)

[noreply] Merge pull request #17079 from [BEAM-13660] Add types and queries in

[noreply] [BEAM-13888] Add unit testing to ioutilx (#17058)

[noreply] Merge pull request #16822 from [BEAM-13841][Playground] Add Application

[noreply] Minor: Make serializableCoder warning grammatically correct English

[noreply] [BEAM-14091] Fixing Interactive Beam show/collect for remote runners

[noreply] [BEAM-11934] Add enable_file_dynamic_sharding to allow DataflowRunner

[noreply] [BEAM-12777] Create symlink for `current` directory (#17105)

[noreply] [BEAM-14020] Adding SchemaTransform, SchemaTransformProvider,

[noreply] [BEAM-13015] Modify metrics to begin and reset to a non-dirty state.


------------------------------------------
[...truncated 57.07 KB...]
> Task :sdks:java:harness:jar

> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar

> Task :sdks:java:expansion-service:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:expansion-service:classes
> Task :sdks:java:expansion-service:jar

> Task :sdks:java:io:kafka:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:classes
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647532980.077380/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220317160300078338-9170'
 createTime: '2022-03-17T16:03:06.851624Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-17_09_03_06-16520541463599878889'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0317150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-17T16:03:06.851624Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-17_09_03_06-16520541463599878889]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-17_09_03_06-16520541463599878889
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_03_06-16520541463599878889?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-17_09_03_06-16520541463599878889 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:31.508Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.489Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.700Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:33.960Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.164Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.197Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.232Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.281Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.334Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.366Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.399Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.431Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.514Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.702Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.764Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:34.871Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.018Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.310Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.620Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.762Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:35.796Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:03:55.849Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:05.528Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:05.586Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:16.013Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:39.379Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:04:39.409Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-17_09_03_06-16520541463599878889 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 799b919e335e4da5b604bf26cf2e95f7 and timestamp: 1647533794.896252:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 118
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0317150459.1647533799.861456/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220317161639862367-7675'
 createTime: '2022-03-17T16:16:45.948426Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-17_09_16_45-14787157976749017395'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0317150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-17T16:16:45.948426Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-17_09_16_45-14787157976749017395]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-17_09_16_45-14787157976749017395
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_16_45-14787157976749017395?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-17_09_16_45-14787157976749017395 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:16:54.906Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:01.982Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.012Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.088Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.247Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.319Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.473Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.610Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:02.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.060Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.144Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.391Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.467Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.525Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.589Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.725Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.810Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.852Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.875Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.913Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.949Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:03.998Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.020Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.063Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:17:04.782Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:24.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:24.756Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:45.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:45.663Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:56.248Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:59.060Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-17T16:19:59.094Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-17_09_16_45-14787157976749017395 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3c179ef93754940a4bcf4371370c909 and timestamp: 1647534722.8106575:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3c179ef93754940a4bcf4371370c909 and timestamp: 1647534722.8106575:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 106
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_49e91cf1-c4de-45b6-a055-d0a8ceb6c521_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_03_06-16520541463599878889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-17_09_16_45-14787157976749017395?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 39s
92 actionable tasks: 73 executed, 17 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ei2hbo4fxnbac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #645

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/645/display/redirect?page=changes>

Changes:

[Chamikara Madhusanka Jayalath] Updates x-lang release validation to use staged jars

[dhuntsperger] documented maven-to-gradle conversion for Dataflow; refactored java

[dhuntsperger] adding a list of example pipelines

[dhuntsperger] removing unnecessary `ls` command from maven project generation

[dhuntsperger] fixing filename formatting in response to feedback

[dhuntsperger] adding extra step emphasizing runner setup

[dhuntsperger] reorganized instructions to emphasize setup steps for runners

[noreply] [BEAM-13767] Move a bunch of python tasks to use gradle configuration…

[noreply] Merge pull request #17052 from [BEAM-13818] [SnowflakeIO] Add support

[noreply] Adding pydoc for StateHandler (#17091)

[noreply] BEAM-3165 Bypass split if numSplit is zero (#17084)


------------------------------------------
[...truncated 55.58 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694465 sha256=187f3c74051128e768a8b63dfe13c46b30a9a2d1482e8320b1f0fb2c855bab4d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.20 botocore-1.24.20 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.2 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.9 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647445838.307384/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220316155038308306-2768'
 createTime: '2022-03-16T15:50:45.005508Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-16_08_50_44-8172013573730558483'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0316150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-16T15:50:45.005508Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-16_08_50_44-8172013573730558483]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-16_08_50_44-8172013573730558483
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-16_08_50_44-8172013573730558483?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-16_08_50_44-8172013573730558483 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:51.060Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.044Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.084Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.184Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.211Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.238Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.274Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.306Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.365Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.382Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.438Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.459Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.483Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.517Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.561Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.614Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.845Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.885Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.907Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.934Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:52.960Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:53.019Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:53.062Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:50:53.094Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:51:22.252Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:51:29.810Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:51:29.839Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:51:40.154Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:52:01.823Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T15:52:01.854Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-16_08_50_44-8172013573730558483 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 491c97ba578f44f1804b60730a150d36 and timestamp: 1647446646.4082:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 115
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0316150529.1647446651.306363/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220316160411307253-3950'
 createTime: '2022-03-16T16:04:17.716924Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-16_09_04_17-9486482442961711439'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0316150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-16T16:04:17.716924Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-16_09_04_17-9486482442961711439]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-16_09_04_17-9486482442961711439
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-16_09_04_17-9486482442961711439?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-16_09_04_17-9486482442961711439 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:26.224Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.273Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.308Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.396Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.470Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.495Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.556Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.612Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.651Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.684Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.715Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.748Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.768Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.812Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.842Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.875Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.921Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.951Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:31.974Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.031Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.063Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.102Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.121Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.152Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.208Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.239Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.296Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.327Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:32.360Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:04:57.567Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:05:07.761Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:05:07.781Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:05:18.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:05:42.967Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-16T16:05:43.029Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-16_09_04_17-9486482442961711439 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 3494b1f2263047da9b4e11e8d2ffc307 and timestamp: 1647447601.0969312:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 266
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-16_08_50_44-8172013573730558483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-16_09_04_17-9486482442961711439?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_e1b5bfc0-6567-4e21-8f01-0cca8fae8022_read'
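The TypeError comes from the test's cleanup path rather than from Dataflow: with google-cloud-pubsub 2.x (2.11.0 per the install log above), SubscriberClient.delete_subscription() takes a DeleteSubscriptionRequest or a dict as its positional argument and exposes the subscription path only as a keyword-only parameter, so passing the path string positionally fails proto-plus validation. A minimal sketch of a call the 2.x client accepts, using a placeholder subscription path rather than the generated pubsub_io_performance_* name:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Placeholder; the perf test derives the real path from its namespace prefix.
    sub_name = "projects/apache-beam-testing/subscriptions/example_read"

    # Either pass the path as the flattened keyword argument...
    sub_client.delete_subscription(subscription=sub_name)
    # ...or build the request explicitly (equivalent; only one call is needed):
    # sub_client.delete_subscription(request={"subscription": sub_name})

Either form avoids the positional-string call at pubsub_io_perf_test.py line 211 that trips the DeleteSubscriptionRequest constructor above.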

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 39s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hp5n57t6b5orc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #644

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/644/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] [BEAM-12572] Add DistinctExample test

[benjamin.gonzalez] [BEAM-12572] Add DistinctExample test

[benjamin.gonzalez] [BEAM-12572] Add java examples tests

[benjamin.gonzalez] [BEAM-12572] Change tests collected by runner

[benjamin.gonzalez] [BEAM-12572] Fix PipelineOptions, skip tests for specific runners

[benjamin.gonzalez] [BEAM-12572] Skip failing tests

[benjamin.gonzalez] [BEAM-12572] Fix checkstyle warnings

[benjamin.gonzalez] [BEAM-12572] Skip tfidf tests on direct runner

[benjamin.gonzalez] [BEAM-12572] Fix PipelineOptions

[benjamin.gonzalez] [BEAM-12572] Fix checkstyle warning

[benjamin.gonzalez] [BEAM-12572] Enable test with Pipeline serialization error fixed, link

[benjamin.gonzalez] [BEAM-12572] Sickbay TfIdf test for DirectRunner

[benjamin.gonzalez] [BEAM-12572] Refactor naming for BigQueryClient

[mmack] [BEAM-13175] Add KinesisIO.write for AWS SDK v2.

[mmack] [adhoc] Update owners for aws modules

[noreply] [BEAM-14051] Add a warning to gradle.properties that many Beam modules

[noreply] Merge pull request #17000 from [BEAM-13881] [Playground] Increase test

[noreply] Add deterministic dict coding via key sorting. (#17037)

[noreply] Merge pull request #17054 from [BEAM-14075] [SnowflakeIO] Change a

[mmack] [BEAM-13175] Document changes to IOs in amazon-web-services2 for AWS


------------------------------------------
[...truncated 55.97 KB...]
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started

> Task :runners:google-cloud-dataflow-java:worker:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2694174 sha256=71da00250e33f1fb777878a52a6f210861cd1ac54002b341b75e6d4797cdef60
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.19 botocore-1.24.19 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647359539.070562/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220315155219071519-5846'
 createTime: '2022-03-15T15:52:25.553421Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-15_08_52_25-1665017609239857947'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0315150523'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-15T15:52:25.553421Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-15_08_52_25-1665017609239857947]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-15_08_52_25-1665017609239857947
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-15_08_52_25-1665017609239857947?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-15_08_52_25-1665017609239857947 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:31.665Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.280Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.301Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.362Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.394Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.419Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.441Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.464Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.497Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.522Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.548Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.575Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.631Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.661Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.680Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.771Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.809Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.853Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.882Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.912Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.957Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:32.979Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:52:33.001Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:53:06.505Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:53:18.716Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:53:43.304Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T15:53:43.356Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-15_08_52_25-1665017609239857947 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 912b377957ec4850b1c23873ae0bc9d0 and timestamp: 1647360333.204236:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 100
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0315150523.1647360337.189817/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220315160537190818-3706'
 createTime: '2022-03-15T16:05:43.940395Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-15_09_05_43-18375466865137123585'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0315150523'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-15T16:05:43.940395Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-15_09_05_43-18375466865137123585]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-15_09_05_43-18375466865137123585
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-15_09_05_43-18375466865137123585?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-15_09_05_43-18375466865137123585 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:50.817Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.061Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.124Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.211Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.352Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.403Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.481Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.586Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.660Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.717Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.761Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.796Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.940Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:54.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.413Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.449Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.481Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.538Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.579Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.669Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.707Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:05:55.804Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:06:08.221Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:06:40.717Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:07:05.100Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-15T16:07:05.158Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-15_09_05_43-18375466865137123585 after 602 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_cdd7dfe2-d1ee-4a53-b15e-3b306c46cfe7_read_matcher.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-15_08_52_25-1665017609239857947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-15_09_05_43-18375466865137123585?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
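For context, the expected payload b'2097152' is the message count the read pipeline's Count messages step publishes for the matcher to verify, and 2,097,152 is exactly a 2 GB dataset split into 1 kB messages (the sizes are an assumption inferred from the 2gb job name and the count, not stated in this log). A quick sanity check of that arithmetic:

    # 2 GiB of 1 KiB messages -> expected count published to the matcher subscription
    # (assumed sizes; the log only shows the final payload b'2097152').
    expected = (2 * 1024**3) // 1024
    assert expected == 2097152 == 2**21

The matcher received nothing on the *_read_matcher subscription within its 900 s timeout, so the count was never delivered, which is what the empty actual side of the diff reports.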


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_cdd7dfe2-d1ee-4a53-b15e-3b306c46cfe7_read'
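This is the same cleanup crash as in the preceding build's log: delete_subscription() is called with a positional string, so the keyword-argument form sketched after the first traceback above applies here as well. Note that it also obscures the primary failure; the matcher's AssertionError is the real problem, and the TypeError raised during cleanup() is merely chained on top of it.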

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 44m 29s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mvxktixvokln6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #643

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/643/display/redirect>

Changes:


------------------------------------------
[...truncated 56.44 KB...]
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693741 sha256=2b0e346294f05926ec351a655674d3a05145deaa21519a7d489415dc991d828f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.18 botocore-1.24.18 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow-worker.jar in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
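
The "Discarding unparseable args" warning means --pubsub_namespace_prefix is not registered with any PipelineOptions parser, so it is silently dropped. A minimal sketch of registering such a flag, assuming Beam's standard _add_argparse_args extension point (the class name here is hypothetical):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Once registered, the flag parses cleanly instead of being discarded.
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix
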
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220314155048200808-5100'
 createTime: '2022-03-14T15:50:56.315646Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-14_08_50_55-745859959694178126'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0314150526'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-14T15:50:56.315646Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-14_08_50_55-745859959694178126]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-14_08_50_55-745859959694178126
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_08_50_55-745859959694178126?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-14_08_50_55-745859959694178126 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:04.453Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.382Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.440Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.510Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.557Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.579Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.645Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.687Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.723Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.781Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.812Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.844Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.876Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.910Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.946Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.040Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.086Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.121Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.153Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.188Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.244Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.301Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.337Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:38.049Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
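
The message above reflects Cloud Monitoring's per-project quota of 100 custom metric descriptors. A minimal sketch of listing and pruning stale custom descriptors to free that quota, assuming the google-cloud-monitoring client library (project id and filter string are illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # List custom metric descriptors; deleting stale ones frees quota so new
    # custom.googleapis.com/* metrics can be created again.
    request = {'name': project_name,
               'filter': 'metric.type = starts_with("custom.googleapis.com/")'}
    for descriptor in client.list_metric_descriptors(request=request):
        # Destructive: gate this on whatever "stale" means for the project.
        client.delete_metric_descriptor(name=descriptor.name)
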
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:51.822Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:51.850Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:52:02.210Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:52:16.632Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:52:16.661Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-14_08_50_55-745859959694178126 after 603 seconds
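
The "Timing out on waiting" warning is the harness giving up on its bounded wait, not a job failure; the streaming job itself keeps running. A minimal sketch of bounding the wait and cancelling explicitly, assuming Beam's PipelineResult API (the timeout value is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline(options=PipelineOptions())  # stand-in for the test pipeline
    result = pipeline.run()
    result.wait_until_finish(duration=600 * 1000)  # duration is in milliseconds
    if not PipelineState.is_terminal(result.state):
        result.cancel()  # a streaming job keeps running (and billing) until cancelled
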
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 73ac7c06ea924418946f236e1d1d9e90 and timestamp: 1647273859.559636:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 103
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220314160425915190-5284'
 createTime: '2022-03-14T16:04:32.428010Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-14_09_04_31-3446386912144349460'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0314150526'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-14T16:04:32.428010Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-14_09_04_31-3446386912144349460]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-14_09_04_31-3446386912144349460
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_09_04_31-3446386912144349460?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-14_09_04_31-3446386912144349460 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:38.866Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.088Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.118Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.172Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.262Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.280Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.350Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.432Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.463Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.521Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.558Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.579Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.653Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.715Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.736Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.789Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.854Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
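
The fusion messages above outline the shape of the read-side job. A rough reconstruction of that pipeline, with subscription/topic names and lambda bodies as placeholders, and the real test's window, trigger and timing DoFns elided:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.transforms import window

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | 'Read from pubsub' >> beam.io.ReadFromPubSub(
                subscription='projects/example/subscriptions/example_read')
            | 'Map' >> beam.Map(lambda msg: msg)  # cf. pubsub_io_perf_test.py:171
            | 'Measure time' >> beam.Map(lambda x: x)  # timing DoFn in the real test
            | 'Window' >> beam.WindowInto(window.FixedWindows(60))
            | 'Count messages' >> beam.CombineGlobally(
                beam.combiners.CountCombineFn()).without_defaults()
            | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
            | 'Write to Pubsub' >> beam.io.WriteToPubSub(
                topic='projects/example/topics/example_matcher'))
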
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.882Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.916Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.948Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.981Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.015Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.060Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.107Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.138Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:13.731Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:26.373Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:54.661Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:54.720Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-14_09_04_31-3446386912144349460 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_5f1ba947-d4b1-485d-ba53-8d93cfc679c2_read_matcher.
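
The error above comes from the verification step: a PubSubMessageMatcher pulls from the *_read_matcher subscription until the expected payload arrives or its timeout elapses. A minimal sketch, assuming Beam's test utilities (project, subscription and payload are placeholders; timeout=900 mirrors the log):

    from hamcrest import assert_that as hc_assert_that
    from apache_beam.io.gcp.tests.pubsub_matcher import PubSubMessageMatcher

    matcher = PubSubMessageMatcher(
        'apache-beam-testing',
        'projects/apache-beam-testing/subscriptions/example_read_matcher',
        expected_msg=[b'2097152'],
        timeout=900)

    # The test runner applies the (pickled) matcher to the pipeline result;
    # when no message arrives in time, it raises the AssertionError below.
    hc_assert_that(result, matcher)  # 'result' is the PipelineResult of the run
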
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_08_50_55-745859959694178126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_09_04_31-3446386912144349460?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_5f1ba947-d4b1-485d-ba53-8d93cfc679c2_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 7s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dpmklkn6ivltk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #642

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/642/display/redirect?page=changes>

Changes:

[noreply] [BEAM-14072] [BEAM-13993] [BEAM-10039] Import beam plugins before


------------------------------------------
[...truncated 54.93 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693741 sha256=d3b20b5b622aeeb36b070fc154735d410dc1dbcd2a9cd98f4216f28b59148060
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.18 botocore-1.24.18 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647186625.612482/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220313155025613433-4879'
 createTime: '2022-03-13T15:50:31.362028Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-13_08_50_30-11357454731140285663'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0313150527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-13T15:50:31.362028Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-13_08_50_30-11357454731140285663]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-13_08_50_30-11357454731140285663
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-13_08_50_30-11357454731140285663?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-13_08_50_30-11357454731140285663 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:35.058Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.390Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.422Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.530Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.570Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.598Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.632Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.665Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.706Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.734Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.767Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.799Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.834Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:38.902Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.002Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.037Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.065Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.098Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.129Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.186Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.217Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:39.259Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:50:55.994Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:51:16.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:51:47.386Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T15:51:47.416Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-13_08_50_30-11357454731140285663 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 3d90db47c2a04a3bba3270e87aa9b580 and timestamp: 1647187414.1611283:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0313150527.1647187417.814884/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220313160337815763-5715'
 createTime: '2022-03-13T16:03:43.530970Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-13_09_03_43-1655120582378969613'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0313150527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-13T16:03:43.530970Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-13_09_03_43-1655120582378969613]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-13_09_03_43-1655120582378969613
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-13_09_03_43-1655120582378969613?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-13_09_03_43-1655120582378969613 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:48.941Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.661Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.687Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.768Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.841Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.869Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.922Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:50.976Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.013Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.039Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.093Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.126Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.149Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.171Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.191Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.226Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.295Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.328Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.351Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.383Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.426Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.454Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.487Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.516Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.548Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.569Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.629Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.660Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:03:51.735Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:04:23.885Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:04:25.882Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:04:25.917Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:04:36.151Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:04:59.324Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-13T16:04:59.353Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-13_09_03_43-1655120582378969613 after 601 seconds
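The timeout warning is the harness's bounded wait expiring, not a job failure by itself; metric collection and cleanup proceed afterwards. A minimal sketch of that pattern, assuming a runner that honors the duration argument (in milliseconds), as DataflowRunner does:

    import apache_beam as beam

    p = beam.Pipeline()  # placeholder; the test runs on DataflowRunner
    result = p.run()
    # Returns after ~600 s even if the streaming job is still running.
    result.wait_until_finish(duration=600 * 1000)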
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 30113345a6c4456bb2a4df8bc447bef2 and timestamp: 1647188496.1391811:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 415
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 30113345a6c4456bb2a4df8bc447bef2 and timestamp: 1647188496.1391811:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 415
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-13_08_50_30-11357454731140285663?project=apache-beam-testing
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-13_09_03_43-1655120582378969613?project=apache-beam-testing
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_2d9a260e-4e91-4d87-9795-dfa899925c48_read'
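The TypeError occurs because the subscription path is passed positionally: in google-cloud-pubsub 2.x the generated client methods accept only a request object or flattened keyword arguments, so a bare string binds to the request parameter and fails validation. A minimal sketch of the failing call and the 2.x-compatible form (the subscription path is a placeholder):

    from google.cloud import pubsub_v1

    client = pubsub_v1.SubscriberClient()
    sub = 'projects/example-project/subscriptions/example-sub'  # placeholder

    # Reproduces the failure on google-cloud-pubsub>=2.0: the positional string
    # is not a valid DeleteSubscriptionRequest mapping.
    # client.delete_subscription(sub)

    # Works on 2.x: pass the flattened keyword (or a request object).
    client.delete_subscription(subscription=sub)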

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32m 14s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/g4k6t74po2dmw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #641

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/641/display/redirect?page=changes>

Changes:

[Ismaël Mejía] [BEAM-13981] Remove Spark Runner specific code for event logging

[vitaly.terentyev] [BEAM-2766] Support null key/values in HadoopFormatIO

[vitaly.terentyev] [BEAM-2766] Fix checkstyle


------------------------------------------
[...truncated 55.56 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693483 sha256=f0b460c86335a7ac42563832d8c75392f67d94eb9dba23648cfa89f2df552f60
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.18 botocore-1.24.18 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647100234.615323/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220312155034616275-7885'
 createTime: '2022-03-12T15:50:41.302224Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-12_07_50_40-8979317908193633580'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0312150525'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-12T15:50:41.302224Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-12_07_50_40-8979317908193633580]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-12_07_50_40-8979317908193633580
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-12_07_50_40-8979317908193633580?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-12_07_50_40-8979317908193633580 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:46.846Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.497Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.650Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.719Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.761Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.790Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.827Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.850Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.892Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.967Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:52.991Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.049Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.083Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.107Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.202Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.234Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.268Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.315Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.339Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.394Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.428Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:50:53.498Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:51:07.905Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:51:34.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:51:34.542Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:51:44.874Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:52:07.511Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T15:52:07.542Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-12_07_50_40-8979317908193633580 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e7bf0ab76dae4075a14dc8efd9d314ca and timestamp: 1647101049.0195248:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0312150525.1647101053.853262/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220312160413854141-5696'
 createTime: '2022-03-12T16:04:19.948069Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-12_08_04_19-11762761464762094130'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0312150525'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-12T16:04:19.948069Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-12_08_04_19-11762761464762094130]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-12_08_04_19-11762761464762094130
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-12_08_04_19-11762761464762094130?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-12_08_04_19-11762761464762094130 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:28.780Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.627Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.666Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.733Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.828Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.856Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.909Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:29.963Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.002Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.028Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.061Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.096Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.130Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.161Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.195Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.228Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.259Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.292Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.327Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.349Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.382Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.408Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.433Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.469Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.500Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.533Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.554Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.609Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.639Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:30.669Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:04:51.760Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:05:11.380Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:05:11.413Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:05:21.710Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:05:46.068Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-12T16:05:46.087Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-12_08_04_19-11762761464762094130 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 3a9442c2ba0944268a96fd0d2877f0e2 and timestamp: 1647101874.8230782:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 78
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 3a9442c2ba0944268a96fd0d2877f0e2 and timestamp: 1647101874.8230782:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 78
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_0356a7fe-6891-4379-906b-ce6f52cc1bdd_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-12_07_50_40-8979317908193633580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-12_08_04_19-11762761464762094130?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 33s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fu6o6yezbtoxi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #640

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/640/display/redirect?page=changes>

Changes:

[ihr] [BEAM-13923] Fix the answers placeholders locations in the Java katas

[jakub.kukul] [BEAM-14039] Propagate ignore_unknown_columns parameter.

[stranniknm] [BEAM-14079] playground - improve accessibility

[noreply] [BEAM-13925] Find and address prs that havent been reviewed in a week

[noreply] Fix import path

[noreply] [BEAM-13925] Fix one more import path

[noreply] Add a StatefulDoFn test that sets event time timer within allowed

[noreply] Merge pull request #17056 from [BEAM-14076] [SnowflakeIO] Add support


------------------------------------------
[...truncated 56.01 KB...]
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693483 sha256=1bbd499f27593c5e96106e9f9752ece994d4ce69a8022e2b12e00cc4883384b9
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.17 botocore-1.24.17 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647013964.431395/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220311155244433113-1614'
 createTime: '2022-03-11T15:52:50.584379Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-11_07_52_50-8035785994620043962'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0311150505'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-11T15:52:50.584379Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-11_07_52_50-8035785994620043962]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-11_07_52_50-8035785994620043962
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-11_07_52_50-8035785994620043962?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-11_07_52_50-8035785994620043962 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:56.077Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:56.880Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:56.912Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.058Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.102Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.136Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.173Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.194Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.258Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.317Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.342Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.374Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.395Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.453Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.484Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.584Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.614Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.640Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.664Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.694Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.752Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.813Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:52:57.852Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:53:35.199Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:53:42.187Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:54:07.011Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T15:54:07.040Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-11_07_52_50-8035785994620043962 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: eda4766e464c433c91957a4b15ab7add and timestamp: 1647014758.2699215:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 131
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/dataflow-worker.jar in 10 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0311150505.1647014763.260369/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220311160603261851-6118'
 createTime: '2022-03-11T16:06:16.802024Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-11_08_06_15-8684848591038411742'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0311150505'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-11T16:06:16.802024Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-11_08_06_15-8684848591038411742]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-11_08_06_15-8684848591038411742
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-11_08_06_15-8684848591038411742?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-11_08_06_15-8684848591038411742 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:21.575Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:22.718Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:22.756Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:22.894Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:22.967Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.001Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.058Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.127Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.156Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.191Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.223Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.260Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.294Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.326Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.368Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.393Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.446Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.468Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.493Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
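The fusion messages above describe the read-side measurement pipeline. A rough reconstruction of that graph (step names are taken from the log; transform bodies, subscription, and topic are illustrative assumptions):

    # Rough sketch of the fused read pipeline; resources are illustrative.
    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
    from apache_beam.transforms import window
    from apache_beam.transforms.combiners import Count

    sub = 'projects/my-project/subscriptions/my-read-sub'    # illustrative
    topic = 'projects/my-project/topics/my-matcher-topic'    # illustrative

    p = beam.Pipeline()
    _ = (p
         | 'Read from pubsub' >> ReadFromPubSub(subscription=sub)
         | 'Map' >> beam.Map(lambda data: data)     # lambda at pubsub_io_perf_test.py:171
         | 'Measure time' >> beam.Map(lambda x: x)  # stands in for the metrics DoFn
         | 'Window' >> beam.WindowInto(window.GlobalWindows())
         # Expands to KeyWithVoid / CombinePerKey / UnKey, matching the log:
         | 'Count messages' >> Count.Globally().without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
         | 'Write to Pubsub' >> WriteToPubSub(topic))

Dataflow then fuses each consumer into its producer where possible, which is exactly what the JOB_MESSAGE_DETAILED lines report.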
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.604Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.634Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.667Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.700Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.734Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.788Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.834Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:23.871Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:06:44.495Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
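The metric-descriptor message concerns the per-project quota of 100 custom descriptors, not this job itself. If stale descriptors need pruning, the two REST methods linked in the message can be driven from the Cloud Monitoring client; a hedged sketch (client library choice, project, and filter are assumptions):

    # Hedged sketch, assuming google-cloud-monitoring >= 2.0 is available;
    # project and filter are illustrative.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    parent = 'projects/apache-beam-testing'

    # metricDescriptors.list, restricted to custom metrics:
    descriptors = client.list_metric_descriptors(request={
        'name': parent,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        # metricDescriptors.delete frees quota for new user metrics.
        client.delete_metric_descriptor(request={'name': descriptor.name})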
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:07:01.168Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:07:01.199Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:07:11.524Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-11T16:07:33.988Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-11_08_06_15-8684848591038411742 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c341622340744bd686003c053585be9f and timestamp: 1647015764.6148105:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 330
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-11_07_52_50-8035785994620043962?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-11_08_06_15-8684848591038411742?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_abb59e62-fcde-4a31-bb25-e66cb99e0f87_read'
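This TypeError is the google-cloud-pubsub 1.x-to-2.x breaking change: in the 2.x generated clients the positional argument of delete_subscription is the request message, so a bare resource string ends up in DeleteSubscriptionRequest(...) and is rejected. A minimal sketch of the failing call and the forms the 2.x client accepts (the subscription path is illustrative):

    # Minimal sketch, assuming google-cloud-pubsub >= 2.0 (matching the
    # 2.10.0 / 2.11.0 versions pip reports installing in these runs).
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/my-project/subscriptions/my-subscription'

    # 1.x style, a bare positional string; under 2.x this is passed into
    # DeleteSubscriptionRequest(...) and raises the TypeError seen above:
    # sub_client.delete_subscription(sub_path)

    # 2.x style: name the field, or pass a request dict / message:
    sub_client.delete_subscription(subscription=sub_path)
    sub_client.delete_subscription(request={'subscription': sub_path})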

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 32m 51s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ybusnmiwaigwq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #639

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/639/display/redirect?page=changes>

Changes:

[hengfeng] [BEAM-12164]: display the metadata table's name on UI

[noreply] Revert "[BEAM-13993] [BEAM-10039] Import beam plugins before starting

[noreply] Merge pull request #17036 from [BEAM-12164] Convert all static instances

[noreply] fix variable reference (#16991)

[noreply] Merge pull request #16844 from [BEAM-12164]: allow for nanosecond

[noreply] [BEAM-13904] Increase unit testing in the reflectx package (#17024)


------------------------------------------
[...truncated 56.37 KB...]
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693407 sha256=e021477371dbba47706f36539f1d185aacd0a57f07d9bc6e3e6327e4ea293111
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.16 botocore-1.24.16 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646927778.627709/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220310155618628626-6338'
 createTime: '2022-03-10T15:56:25.722761Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-10_07_56_24-13058135381658350062'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0308201534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-10T15:56:25.722761Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-10_07_56_24-13058135381658350062]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-10_07_56_24-13058135381658350062
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-10_07_56_24-13058135381658350062?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-10_07_56_24-13058135381658350062 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:31.032Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:31.905Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:31.946Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.071Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.101Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.156Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.184Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.220Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.251Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.281Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.310Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.338Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.374Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.426Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.483Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
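A rough reconstruction of the write-side graph described by the fusion messages above (beam.Create stands in for the SDF-based synthetic source; step names follow the log, bodies and topic are illustrative assumptions):

    # Rough sketch of the fused write pipeline; resources are illustrative.
    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import WriteToPubSub

    topic = 'projects/my-project/topics/my-write-topic'  # illustrative

    p = beam.Pipeline()
    _ = (p
         | 'Create input' >> beam.Create(range(1000))  # stands in for the SyntheticSource SDF read
         | 'Format to pubsub message in bytes' >> beam.Map(lambda i: b'x' * 100)
         | 'Measure time' >> beam.Map(lambda x: x)     # stands in for the metrics DoFn
         | 'Write to Pubsub' >> WriteToPubSub(topic))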
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.563Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.595Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.626Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.651Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.686Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.734Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.775Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:32.801Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:56:55.539Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:57:13.017Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:57:37.906Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T15:57:37.938Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-10_07_56_24-13058135381658350062 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 3ce2480db27542d495fea7bf31fbf1ed and timestamp: 1646928570.0150244:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 104
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646928574.749392/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220310160934750276-5142'
 createTime: '2022-03-10T16:09:42.513536Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-10_08_09_42-6827497964498722909'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0308201534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-10T16:09:42.513536Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-10_08_09_42-6827497964498722909]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-10_08_09_42-6827497964498722909
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-10_08_09_42-6827497964498722909?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-10_08_09_42-6827497964498722909 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:46.985Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:48.721Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:48.784Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:48.851Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:48.925Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:48.952Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.007Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.071Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.104Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.133Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.200Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.226Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.330Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.358Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.383Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.406Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.437Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.561Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.586Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.616Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.645Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.679Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.726Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.761Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:09:49.814Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:10:26.253Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:10:26.914Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:10:26.947Z: JOB_MESSAGE_DETAILED: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:10:37.328Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:11:00.354Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-10T16:11:00.382Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-10_08_09_42-6827497964498722909 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_042f8ebd-9f86-4da6-b367-735fdbda423a_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-10_07_56_24-13058135381658350062?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-10_08_09_42-6827497964498722909?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_042f8ebd-9f86-4da6-b367-735fdbda423a_read'
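The failure chain in this run: the read job produced nothing the matcher could see within its 900 s timeout, assert_that raised the AssertionError, and cleanup then died on the same 2.x delete_subscription incompatibility shown earlier. The matcher side follows Beam's on_success_matcher pattern; a rough sketch (constructor parameters and option wiring are assumptions, names follow the modules in the traceback):

    # Rough sketch of the verifier wiring, under the assumptions noted above.
    from hamcrest import all_of
    from apache_beam.io.gcp.tests.pubsub_matcher import PubSubMessageMatcher
    from apache_beam.runners.runner import PipelineState
    from apache_beam.testing.pipeline_verifiers import PipelineStateMatcher
    from apache_beam.testing.test_pipeline import TestPipeline

    project = 'apache-beam-testing'
    sub = 'projects/apache-beam-testing/subscriptions/my_matcher_sub'  # illustrative

    state_verifier = PipelineStateMatcher(PipelineState.RUNNING)
    # Polls the subscription until the expected payload shows up or the
    # timeout expires; on timeout assert_that() produces the
    # "Expected 1 messages. Got 0 messages." diff seen above.
    msg_verifier = PubSubMessageMatcher(
        project, sub, expected_msg=[b'2097152'], timeout=900)

    pipeline = TestPipeline(is_integration_test=True)
    args = pipeline.get_full_options_as_args(
        on_success_matcher=all_of(state_verifier, msg_verifier))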

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 2s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ualgvketjiq76

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #638

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/638/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Update dataflow API client.

[Robert Bradshaw] Instructions for updating apitools generated files.

[noreply] [BEAM-13709] Inconsistent behavior when parsing boolean flags across

[noreply] [BEAM-10976] Bundle finalization: Harness and some exec changes (#16980)

[noreply] Merge pull request #16976 from [BEAM-14010] [Website] Add Playground

[noreply] [BEAM-12447] Upgrade cloud build client and add/cleanup options (#17032)


------------------------------------------
[...truncated 56.68 KB...]
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693718 sha256=e1e0309e1e05f1e236cece9da33cff4890e062eb579ae7012d7398186ff9d0d1
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.9.0 boto3-1.21.15 botocore-1.24.15 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.10.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841034.478407/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220309155034479362-4883'
 createTime: '2022-03-09T15:50:41.438786Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-09_07_50_40-9956727024899737363'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0308201534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-09T15:50:41.438786Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-09_07_50_40-9956727024899737363]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-09_07_50_40-9956727024899737363
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-09_07_50_40-9956727024899737363?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-09_07_50_40-9956727024899737363 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:46.697Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.561Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.591Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.660Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.701Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.740Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.776Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.814Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.854Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.891Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.922Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.950Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:47.982Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.014Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.102Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.225Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.278Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.313Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.372Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.397Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.445Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.482Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:50:48.538Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:51:19.893Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:51:23.971Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:51:24.003Z: JOB_MESSAGE_DETAILED: Resized **** pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:51:34.418Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:51:56.818Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T15:51:56.850Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-09_07_50_40-9956727024899737363 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 40c0206eefa645bd8dc7d8a9edfa6390 and timestamp: 1646841844.1007853:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 135
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308201534.1646841849.836760/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
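
The discarded flag above is a test-only argument; PipelineOptions drops anything its argparse parser does not recognize. A minimal sketch of how such a flag is registered so it parses instead of being discarded (the option class here is illustrative, not necessarily the one the test defines):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        # Illustrative subclass; Beam merges these arguments into the
        # shared PipelineOptions parser.
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                "--pubsub_namespace_prefix",
                default="pubsub_io_performance_",
                help="Prefix for temporary Pub/Sub topics and subscriptions.")

    options = PipelineOptions(["--pubsub_namespace_prefix=pubsub_io_performance_"])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix
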
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220309160409837697-9498'
 createTime: '2022-03-09T16:04:16.335245Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-09_08_04_15-5997906062407497914'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0308201534'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-09T16:04:16.335245Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-09_08_04_15-5997906062407497914]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-09_08_04_15-5997906062407497914
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-09_08_04_15-5997906062407497914?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-09_08_04_15-5997906062407497914 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:21.846Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.530Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.568Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.634Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.744Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.773Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.844Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.929Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.962Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:23.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.031Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.094Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.194Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.323Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.356Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.387Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.420Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.460Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.488Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.520Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.554Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.578Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.657Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.745Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:24.789Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:04:49.857Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:05:13.135Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:05:37.043Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-09T16:05:37.080Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-09_08_04_15-5997906062407497914 after 600 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_5418d009-02e6-4125-9719-ac2b2ae94dbb_read_matcher.
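
The matcher pulled from the _read_matcher subscription for 900 s and saw nothing, which is what produces the 0-message assertion failure below. A minimal sketch of probing such a subscription by hand with the google-cloud-pubsub 2.x client (the subscription path is illustrative):

    from google.cloud import pubsub_v1

    # Minimal sketch, assuming google-cloud-pubsub >= 2.0.
    subscriber = pubsub_v1.SubscriberClient()
    sub = "projects/apache-beam-testing/subscriptions/example_read_matcher"  # illustrative

    # Synchronously pull a few messages; an empty response here mirrors the
    # "Received 0 messages" timeout above. The call may wait on the server
    # briefly before returning empty.
    response = subscriber.pull(request={"subscription": sub, "max_messages": 5})
    for received in response.received_messages:
        print(received.message.data)
        subscriber.acknowledge(
            request={"subscription": sub, "ack_ids": [received.ack_id]})
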
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-09_07_50_40-9956727024899737363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-09_08_04_15-5997906062407497914?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_5418d009-02e6-4125-9719-ac2b2ae94dbb_read'
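
This TypeError is a cleanup bug independent of the missing messages: in google-cloud-pubsub 2.x (these builds install 2.10.0, per the pip output further down), the first positional argument of delete_subscription is interpreted as a proto request mapping, so a bare subscription path no longer works. A minimal sketch of the call shape the 2.x client expects (the subscription name is illustrative):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = "projects/apache-beam-testing/subscriptions/example_read"  # illustrative

    # Raises the TypeError above on google-cloud-pubsub 2.x:
    #     sub_client.delete_subscription(sub_name)
    # The resource name must be passed as a keyword...
    sub_client.delete_subscription(subscription=sub_name)
    # ...or wrapped in a request mapping:
    # sub_client.delete_subscription(request={"subscription": sub_name})
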

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 25s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w6yoqepfu7bqi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #637

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/637/display/redirect?page=changes>

Changes:

[dannymccormick] [BEAM-11085] Test that windows are correctly observed in DoFns

[jrmccluskey] [BEAM-14050] Update taxi.go example instructions

[noreply] Give pr bot write permissions on pr update

[noreply] Adding a logical type for Schemas using proto serialization. (#16940)

[noreply] BEAM-13765 missing PAssert methods (#16668)

[noreply] [BEAM-13909] improve coverage of Provision package (#17014)

[noreply] Merge pull request #17027: [BEAM-11205] Upgrade GCP Libraries BOM


------------------------------------------
[...truncated 56.71 KB...]
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2680293 sha256=e1ebc2e47e96c5f254af36cfed9df70e66058866ec0d8628f82a45ed107fe322
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7

> Task :runners:google-cloud-dataflow-java:****:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.9.0 boto3-1.21.14 botocore-1.24.14 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 google-cloud-language-1.3.0 google-cloud-pubsub-2.10.0 google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.1 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3
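
Among the pins above, google-cloud-pubsub 2.10.0 is the client surface implicated in the DeleteSubscriptionRequest TypeError these builds keep hitting. A minimal sketch for confirming the installed version from inside the Python 3.7 gradleenv (pkg_resources, since importlib.metadata only landed in 3.8):

    import pkg_resources

    # Minimal sketch: check which client-library major version is in play.
    version = pkg_resources.get_distribution("google-cloud-pubsub").version
    print(version)  # "2.10.0" in this environment
    # 1.x accepted positional resource names; 2.x requires keywords or a
    # request mapping.
    assert not version.startswith("1.")
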

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646754736.472332/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220308155216473250-8212'
 createTime: '2022-03-08T15:52:23.492884Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-08_07_52_22-6208766471318514496'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0308151328'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-08T15:52:23.492884Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-08_07_52_22-6208766471318514496]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-08_07_52_22-6208766471318514496
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-08_07_52_22-6208766471318514496?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-08_07_52_22-6208766471318514496 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:29.531Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:30.998Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.034Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.098Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.128Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.165Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.200Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.226Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.268Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.289Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.367Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.411Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.440Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.475Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.524Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.557Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.645Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.678Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.699Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.741Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.768Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.824Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.861Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:31.893Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:52:58.961Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:53:15.673Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:53:15.704Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:53:26.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:53:40.549Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T15:53:40.590Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-08_07_52_22-6208766471318514496 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0e4178c342df4e43900af140e9c2053f and timestamp: 1646755517.0971382:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 132
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0308151328.1646755523.404417/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220308160523405314-8432'
 createTime: '2022-03-08T16:05:30.268586Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-08_08_05_29-17240757011938371162'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0308151328'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-08T16:05:30.268586Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-08_08_05_29-17240757011938371162]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-08_08_05_29-17240757011938371162
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-08_08_05_29-17240757011938371162?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-08_08_05_29-17240757011938371162 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:37.372Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.344Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.367Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.417Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.493Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.592Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.645Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.710Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.755Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.791Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.825Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.889Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.963Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:41.994Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.028Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.096Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.163Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.234Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.290Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.345Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.377Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.404Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.460Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.482Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:42.521Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:05:59.904Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:06:28.619Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:06:51.629Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-08T16:06:51.668Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-08_08_05_29-17240757011938371162 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_da11ccdf-a3fd-4f7b-9168-2db5e4272712_read_matcher.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_da11ccdf-a3fd-4f7b-9168-2db5e4272712_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-08_07_52_22-6208766471318514496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-08_08_05_29-17240757011938371162?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 44m 36s
92 actionable tasks: 62 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dinvxujff7ot2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #636

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/636/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13925] Add ability to get metrics on pr-bot performance (#16985)


------------------------------------------
[...truncated 56.45 KB...]
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2680293 sha256=9b2f7cdb4d1661a0883914fc33b6ae07ad22a3db8a8a220582ea2da444953394
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.9.0 boto3-1.21.13 botocore-1.24.13 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.2 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.10.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.1 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646668237.297039/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220307155037297945-7577'
 createTime: '2022-03-07T15:50:43.651141Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-07_07_50_43-9465715162622133418'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0307151411'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-07T15:50:43.651141Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-07_07_50_43-9465715162622133418]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-07_07_50_43-9465715162622133418
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-07_07_50_43-9465715162622133418?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-07_07_50_43-9465715162622133418 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:49.512Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:50.734Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:50.772Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:50.841Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:50.889Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:50.923Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:50.957Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.006Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.051Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.075Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.107Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.149Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.182Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.219Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.252Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.283Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
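
For reference: the fused step names above correspond to a write pipeline of roughly the following shape. This is a sketch reconstructed from the log, not the test's actual code; the real test generates ~2 GB from a synthetic source (beam.Create stands in here so the sketch runs), and the topic name is a placeholder.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        _ = (
            p
            | 'Create input' >> beam.Create([1024] * 10)  # synthetic source in the real test
            | 'Format to pubsub message in bytes' >> beam.Map(lambda n: b'\x00' * n)
            | 'Measure time' >> beam.Map(lambda x: x)  # metrics hook in the real test
            | 'Write to Pubsub' >> beam.io.WriteToPubSub(
                topic='projects/<project>/topics/<write-topic>'))
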
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.394Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.438Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.473Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.501Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.530Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.595Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.635Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:51.666Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:50:59.547Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
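
For reference: the 100-descriptor warning above is harmless to the test itself but recurs on every run. A minimal cleanup sketch along the lines the message suggests, using the Cloud Monitoring client (assuming google-cloud-monitoring >= 2.0, and assuming the old custom descriptors really are unused):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(request={'name': project_name}):
        # Only Dataflow-created custom metrics count against the 100-descriptor quota.
        if descriptor.type.startswith('custom.googleapis.com/'):
            client.delete_metric_descriptor(request={'name': descriptor.name})
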
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:51:37.229Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:52:01.269Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T15:52:01.302Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-07_07_50_43-9465715162622133418 after 601 seconds
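
For reference: the timeout above means the test harness detached from the streaming job after its wait budget, not that the job failed; the job keeps running until cancelled. A minimal sketch of the pattern (the Pipeline here stands in for the test's Dataflow pipeline; duration is in milliseconds):

    import apache_beam as beam

    p = beam.Pipeline()  # stands in for the test's Dataflow pipeline
    result = p.run()
    # Block for at most ~10 minutes, then return control to the caller
    # even if the streaming job is still running.
    result.wait_until_finish(duration=10 * 60 * 1000)
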
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: b5463b727e9b4680ae4c067e8fa0594e and timestamp: 1646669021.4488835:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 105
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/dataflow-worker.jar in 8 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0307151411.1646669026.804729/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
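
For reference: the "unparseable args" warning is expected; --pubsub_namespace_prefix is a flag of the test harness, not of the Beam pipeline. A sketch of how such a flag could be registered so PipelineOptions stops discarding it (NamespaceOptions is a hypothetical helper, not the harness's code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class NamespaceOptions(PipelineOptions):  # hypothetical helper
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag makes it a first-class pipeline option.
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(NamespaceOptions).pubsub_namespace_prefix)
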
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220307160346805675-9525'
 createTime: '2022-03-07T16:03:56.339972Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-07_08_03_55-7219671244377936705'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0307151411'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-07T16:03:56.339972Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-07_08_03_55-7219671244377936705]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-07_08_03_55-7219671244377936705
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-07_08_03_55-7219671244377936705?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-07_08_03_55-7219671244377936705 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:00.592Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:01.664Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:01.704Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:01.773Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:01.936Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:01.987Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.097Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.166Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.209Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.280Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.298Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.333Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.367Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.402Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.435Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.465Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.564Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.596Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.639Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.672Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
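
For reference: the fused step names above correspond to a read pipeline of roughly this shape. A sketch reconstructed from the log, not the test's exact code; subscription/topic names, windowing, and triggering are simplified assumptions.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import window

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        _ = (
            p
            | 'Read from pubsub' >> beam.io.ReadFromPubSub(
                subscription='projects/<project>/subscriptions/<read-sub>')
            | 'Map' >> beam.Map(len)
            | 'Measure time' >> beam.Map(lambda x: x)  # metrics hook in the real test
            | 'Window' >> beam.WindowInto(window.GlobalWindows())
            | 'Count messages' >> beam.combiners.Count.Globally().without_defaults()
            | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
            | 'Write to Pubsub' >> beam.io.WriteToPubSub(
                topic='projects/<project>/topics/<matcher-topic>'))
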
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.713Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.752Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.784Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.835Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.871Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.932Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.959Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:02.990Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:26.653Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:04:47.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:05:13.265Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-07T16:05:13.290Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-07_08_03_55-7219671244377936705 after 600 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_ecc5c3d5-c9e8-4920-9487-c9897e5cb9c9_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-07_07_50_43-9465715162622133418?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-07_08_03_55-7219671244377936705?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
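
For reference: the diff above comes from the harness's on_success_matcher, which pulls from a verifier subscription until it sees the expected element count (b'2097152' is the number of 1 KB messages in 2 GB). A sketch of how such a check is wired up; the subscription name is a placeholder and pipeline_result stands for the read job's PipelineResult:

    from hamcrest import assert_that as hc_assert_that
    from apache_beam.io.gcp.tests.pubsub_matcher import PubSubMessageMatcher

    matcher = PubSubMessageMatcher(
        project='apache-beam-testing',
        sub_name='projects/apache-beam-testing/subscriptions/<matcher-sub>',
        expected_msg=[b'2097152'],  # expected payload: the total message count
        timeout=900)  # seconds, matching the 900 sec timeout logged above
    hc_assert_that(pipeline_result, matcher)  # pipeline_result: the read job's result
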


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_ecc5c3d5-c9e8-4920-9487-c9897e5cb9c9_read'
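
For reference: this TypeError is a second, independent failure raised while cleaning up after the assertion error above, and it masks it. It is the known breaking change in google-cloud-pubsub 2.x (the environment installs 2.10.0): delete_subscription no longer accepts a positional subscription path, so a positional argument is parsed as a request object and rejected. A minimal sketch of the corrected call, using the path from the log:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_ecc5c3d5-c9e8-4920-9487-c9897e5cb9c9_read')
    # Keyword form required by google-cloud-pubsub >= 2.0; the positional
    # form raises the TypeError shown above.
    sub_client.delete_subscription(subscription=sub_path)
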

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 58s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/mn4aoswb4uguw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #635

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/635/display/redirect>

Changes:


------------------------------------------
[...truncated 55.37 KB...]
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2680293 sha256=388ac6ede6fd506ed80504928172ec956bc93591dca7061b72a186474c846e79
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.9.0 boto3-1.21.13 botocore-1.24.13 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.1 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.10.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.1 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646581851.623887/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220306155051624856-8776'
 createTime: '2022-03-06T15:50:58.617669Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-06_07_50_57-14351214615986308892'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0306152557'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-06T15:50:58.617669Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-06_07_50_57-14351214615986308892]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-06_07_50_57-14351214615986308892
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-06_07_50_57-14351214615986308892?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-06_07_50_57-14351214615986308892 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:11.395Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.717Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.747Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.830Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.859Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.888Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.923Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.954Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:12.983Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.010Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.043Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.075Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.109Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.145Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.194Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.218Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.304Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.332Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.358Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.398Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.424Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.494Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.519Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:13.554Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:42.152Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:51:59.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:52:24.561Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T15:52:24.620Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-06_07_50_57-14351214615986308892 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 973fec4d9f234ce08ff5632b20c35238 and timestamp: 1646582666.0861292:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0306152557.1646582669.660515/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220306160429661452-4287'
 createTime: '2022-03-06T16:04:36.046258Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-06_08_04_35-6542643857717191750'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0306152557'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-06T16:04:36.046258Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-06_08_04_35-6542643857717191750]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-06_08_04_35-6542643857717191750
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-06_08_04_35-6542643857717191750?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-06_08_04_35-6542643857717191750 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:51.381Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.154Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.188Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.244Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.308Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.346Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.414Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.527Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.564Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.596Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.630Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.706Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.732Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.750Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.799Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.828Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.897Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.930Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:53.985Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.016Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.055Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.082Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.112Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.138Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.172Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.230Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.265Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:04:54.291Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:05:23.821Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:05:39.135Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:06:00.915Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-06T16:06:00.949Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-06_08_04_35-6542643857717191750 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 31e5a92b7cb442fca85365d3f8bd0e32 and timestamp: 1646583479.6017215:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 104
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-06_07_50_57-14351214615986308892?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-06_08_04_35-6542643857717191750?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_7e452f4c-9c25-4b5b-9010-0e4f19097b70_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 30s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pkcau2d47ugqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #634

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/634/display/redirect?page=changes>

Changes:

[rahuliyer573] py: Import beam plugins before starting SdkHarness

[stephen.patel] BEAM-14011 fix s3 filesystem multipart copy

[Valentyn Tymofieiev] Bump numpy bound to include 1.22 and regenerate container deps.

[github-actions] [BEAM-13925] months in date constructor are 0 indexed

[noreply] Merge pull request #16842 from [BEAM-13932][Playground] Container's user

[noreply] Doc updates and blog post for 2.37.0 (#16887)

[noreply] Remove resolved issue in docs + update class path on sample (#17018)

[noreply] [BEAM-14016] Fixed flaky postcommit test (#17009)

[noreply] Remove resolved issue in notebook

[noreply] [BEAM-13947] Add split() and rsplit(), non-deferred column operations on

[noreply] BEAM-14026 - Fixes bug related to Unnesting nested rows in an array


------------------------------------------
[...truncated 55.43 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2680293 sha256=94057d54747194deea583bd377639191fcb82a3e0069849ba1713c9d42dee4ad
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.9.0 boto3-1.21.13 botocore-1.24.13 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.1 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.10.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.1 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646495435.122310/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
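
The "Discarding unparseable args" warnings above mean the flag is not registered with any PipelineOptions parser, so Beam drops it from the pipeline options. A minimal sketch of how such a flag parses cleanly; the options class here is hypothetical, not the load test's actual wiring:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        """Hypothetical options class, used only to illustrate flag parsing."""

        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag is what keeps PipelineOptions from
            # reporting it as unparseable.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used to namespace the test topic/subscription.')

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(options.view_as(PubsubPerfOptions).pubsub_namespace_prefix)

Here the warning looks benign: the run proceeds, and the prefix still shows up in the subscription names created later in this log.
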
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220305155035123214-1737'
 createTime: '2022-03-05T15:50:41.362740Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-05_07_50_40-16911083459655628945'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0305150502'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-05T15:50:41.362740Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-05_07_50_40-16911083459655628945]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-05_07_50_40-16911083459655628945
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-05_07_50_40-16911083459655628945?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-05_07_50_40-16911083459655628945 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:47.089Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.037Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.056Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.155Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.213Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.241Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.276Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.309Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.355Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.396Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.430Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.480Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.544Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.577Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.642Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.687Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.831Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.862Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.888Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.914Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:48.949Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:49.005Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:49.038Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:50:49.081Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:51:01.388Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
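
The metric-descriptor notice above recurs on every run: the project has accumulated the maximum 100 Dataflow-created custom descriptors. A hedged cleanup sketch along the lines the message suggests (google-cloud-monitoring is assumed to be installed; it is not one of this build's dependencies):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # Only custom.googleapis.com/* descriptors count against the limit
    # mentioned in the log message above.
    descriptors = client.list_metric_descriptors(
        request={
            'name': project_name,
            'filter': 'metric.type = starts_with("custom.googleapis.com/")',
        })
    for descriptor in descriptors:
        # Deleting old / unused descriptors frees room for new user metrics.
        client.delete_metric_descriptor(name=descriptor.name)
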
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:51:39.891Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:52:02.821Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T15:52:02.852Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-05_07_50_40-16911083459655628945 after 604 seconds
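
The "Timing out on waiting" warning above is expected for this streaming load test: the job never reaches a terminal state on its own, so the harness waits a bounded time, then reads its metrics and tears the job down. A minimal sketch of that pattern (the function and names are illustrative, not the harness's actual code):

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    def run_bounded(pipeline: beam.Pipeline, timeout_ms: int = 10 * 60 * 1000):
        # wait_until_finish takes a duration in milliseconds; a streaming
        # job is normally still running when it expires, which is what
        # produces the "Timing out on waiting for job ..." warning.
        result = pipeline.run()
        result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(result.state):
            result.cancel()  # cancel the streaming job after measuring
        return result
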
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: d410c5bb349b4550b907c4af066527c2 and timestamp: 1646496238.92307:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/dataflow-worker.jar in 11 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0305150502.1646496243.193703/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220305160403194588-1552'
 createTime: '2022-03-05T16:04:16.412167Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-05_08_04_15-6178277967823640008'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0305150502'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-05T16:04:16.412167Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-05_08_04_15-6178277967823640008]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-05_08_04_15-6178277967823640008
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-05_08_04_15-6178277967823640008?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-05_08_04_15-6178277967823640008 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:24.975Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:31.898Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:36.921Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:37.724Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.062Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.164Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.200Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.304Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.395Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.487Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.528Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.601Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.624Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.661Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.703Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.735Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.768Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.819Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.871Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.905Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.937Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:38.968Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
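
The "Count messages" steps in the fusion log above (KeyWithVoid, CombinePerKey/Combine, UnKey) are the standard expansion of a global count combine. A self-contained sketch that produces the same step names:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([b'msg-1', b'msg-2', b'msg-3'])
            # Expands to KeyWithVoid -> CombinePerKey -> UnKey, the step
            # names visible in the fusion messages above.
            | 'Count messages' >> beam.combiners.Count.Globally()
            | beam.Map(print))
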
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.008Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.042Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.077Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.110Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.133Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.192Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.229Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:04:39.278Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:05:24.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:05:24.845Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:05:35.202Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
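
"Resized worker pool to 4, though goal was 5.  This could be a quota issue." points at regional Compute Engine quota. One hedged way to inspect it from Python (google-cloud-compute is assumed available; it is not part of this build):

    from google.cloud import compute_v1

    client = compute_v1.RegionsClient()
    region = client.get(project='apache-beam-testing', region='us-central1')

    # Each entry reports regional usage against its limit (CPUS,
    # IN_USE_ADDRESSES, DISKS_TOTAL_GB, and similar metrics).
    for quota in region.quotas:
        print(quota.metric, quota.usage, quota.limit)
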
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:05:49.138Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-05T16:05:49.164Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-05_08_04_15-6178277967823640008 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0f6260190e6a45b580601e1f339aaae9 and timestamp: 1646497053.4806542:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 107
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-05_07_50_40-16911083459655628945?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-05_08_04_15-6178277967823640008?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_be9a9ae6-5f17-467a-ad77-e38710a12c25_read'
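
The TypeError above is a known google-cloud-pubsub 1.x to 2.x break: the generated 2.x clients no longer accept a bare resource string as a positional argument, so the string is handed to the DeleteSubscriptionRequest constructor as if it were a mapping. A short sketch of the failing and working call shapes (the subscription path is a placeholder):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = 'projects/my-project/subscriptions/my-sub'  # placeholder

    # Fails on google-cloud-pubsub >= 2.x with the TypeError above, because
    # the positional argument is interpreted as a DeleteSubscriptionRequest:
    # sub_client.delete_subscription(sub_name)

    # Works on 2.x: pass the resource as a keyword argument ...
    sub_client.delete_subscription(subscription=sub_name)

    # ... or as an explicit request mapping.
    sub_client.delete_subscription(request={'subscription': sub_name})

The cleanup call in pubsub_io_perf_test.py would need one of the keyword forms to work against the 2.x client installed in this environment (google-cloud-pubsub-2.10.0 above).
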

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 11s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nmh625h5cee4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #633

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/633/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-13999] playground - support vertical orientation for graph

[noreply] Merge pull request #16879 from [BEAM-12164] Add javadocs to

[noreply] [Cleanup] Update pre-v2 go package references (#17002)

[noreply] [BEAM-13885] Add unit tests to window package (#16971)

[noreply] Merge pull request #16891 from [BEAM-13872] [Playground] Increase test

[noreply] Merge pull request #16912 from [BEAM-13878] [Playground] Increase test

[noreply] Merge pull request #16946 from [BEAM-13873] [Playground] Increase test

[noreply] [BEAM-13951] Update mass_comment.py list of Run commands (#16889)

[noreply] [BEAM-10652] Allow Clustering without Partition in BigQuery (#16578)

[noreply] [BEAM-13857] Add K:V flags for expansion service jars and addresses to


------------------------------------------
[...truncated 55.78 KB...]
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2678703 sha256=0c043c44b28359be8d4d131e7a5af6e60d880ad265479058cde1bc985420a1dc
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.23.0 azure-storage-blob-12.9.0 boto3-1.21.12 botocore-1.24.12 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.1 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.1 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409036.288686/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220304155036289644-8783'
 createTime: '2022-03-04T15:50:43.603418Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-04_07_50_43-11382542961559053908'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0304150514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-04T15:50:43.603418Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-04_07_50_43-11382542961559053908]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-04_07_50_43-11382542961559053908
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-04_07_50_43-11382542961559053908?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-04_07_50_43-11382542961559053908 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:47.827Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.058Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.194Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.235Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.262Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.288Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.316Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.352Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.377Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.414Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.467Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.501Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.535Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.569Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.706Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.730Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.747Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.779Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.809Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.863Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.897Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:50:49.941Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:51:24.023Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:51:26.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:51:26.062Z: JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:51:36.438Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:51:58.896Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T15:51:58.924Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-04_07_50_43-11382542961559053908 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7e46b5a4f6bd459cb400fe8aeed0a4e1 and timestamp: 1646409836.2763982:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0304150514.1646409839.834757/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220304160359835659-3329'
 createTime: '2022-03-04T16:04:06.091289Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-04_08_04_05-9625951374427566682'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0304150514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-04T16:04:06.091289Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-04_08_04_05-9625951374427566682]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-04_08_04_05-9625951374427566682
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-04_08_04_05-9625951374427566682?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-04_08_04_05-9625951374427566682 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:12.651Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:13.809Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:13.844Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:13.927Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.048Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.095Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.154Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.217Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.256Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.283Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.316Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.348Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.372Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.406Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.429Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.508Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.551Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.584Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.617Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.648Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.684Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.710Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.746Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.777Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.810Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.845Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.895Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.925Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:14.973Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:34.459Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:04:57.480Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:05:20.821Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-04T16:05:20.848Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-04_08_04_05-9625951374427566682 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: cbc10155bad147929b6e0c40ba16d8f6 and timestamp: 1646410666.136224:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 142
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-04_07_50_43-11382542961559053908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-04_08_04_05-9625951374427566682?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_aba538e1-e6b7-4c9d-99c6-bcded839a6a4_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 27s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/n4633bew7irxw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #632

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/632/display/redirect?page=changes>

Changes:

[Alexey Romanenko] Bump org.mongodb:mongo-java-driver to 3.12.10

[noreply] [BEAM-13973] Link Dataproc Flink master URLs to the InteractiveRunner

[noreply] [BEAM-13925] Turn pr bot on for go prs (#16984)

[Pablo Estrada] Skipping flaky sad-path tests for Spanner changestreams

[noreply] [BEAM-13964] Bump kotlin to 1.6.x (#16882)

[noreply] Merge pull request #16906: [BEAM-13974] Handle idle Storage Api streams

[noreply] Merge pull request #16562 from [BEAM-13051][D] Enable pylint warnings

[noreply] [BEAM-13925] A couple small pr-bot bug fixes (#16996)

[noreply] [BEAM-14029] Add getter, setter for target maven repo (#16995)

[noreply] [BEAM-13903] Improve coverage of metricsx package (#16994)

[noreply] [BEAM-13892] Improve coverage of avroio package (#16990)

[noreply] [adhoc] Prepare aws2 ClientConfiguration for json serialization and


------------------------------------------
[...truncated 55.83 KB...]
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2678703 sha256=9261ec2c47cce3008e1a035fed7802ab5db181c08183f223492f66906c3d6a8e
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.11 botocore-1.24.11 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.1 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646322637.152979/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
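
The two "Discarding unparseable args" warnings above are emitted for any flag that has no registered pipeline option; the run continues regardless, since --pubsub_namespace_prefix is consumed by the test harness rather than by the pipeline. For an option that should survive parsing, the usual Beam pattern is to register it on a PipelineOptions subclass. A minimal sketch, assuming a hypothetical PubsubNamespaceOptions class (not part of the test code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):  # hypothetical class
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag keeps PipelineOptions from discarding it.
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix)
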
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220303155037153905-7777'
 createTime: '2022-03-03T15:50:43.847527Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-03_07_50_43-7740904351671811608'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0303150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-03T15:50:43.847527Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-03_07_50_43-7740904351671811608]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-03_07_50_43-7740904351671811608
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-03_07_50_43-7740904351671811608?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-03_07_50_43-7740904351671811608 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:48.130Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:50.857Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:50.887Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:50.962Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.007Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.048Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.100Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.132Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.175Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.202Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.223Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.258Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.281Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.367Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.474Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.509Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.543Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.579Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.604Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.654Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.691Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:50:51.738Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:51:24.400Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
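
This metric-descriptor message recurs on every run in this project: the 100-descriptor quota is already full, so per-run custom metrics fall back to dataflow.googleapis.com/job/user_counter. The message's own remedy is deleting old custom.googleapis.com/* descriptors. A minimal sketch with the Cloud Monitoring client library, assuming a placeholder project id (deletion is irreversible, so the filter matters):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    parent = 'projects/my-project'  # placeholder project

    for descriptor in client.list_metric_descriptors(name=parent):
        # Only user-defined descriptors count against the 100-descriptor quota.
        if descriptor.type.startswith('custom.googleapis.com/'):
            client.delete_metric_descriptor(name=descriptor.name)
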
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:51:29.468Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:51:29.497Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:51:39.914Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:52:02.572Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T15:52:02.610Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-03_07_50_43-7740904351671811608 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c5657859ab7d459c901972bfa04f9ab5 and timestamp: 1646323431.0900571:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 141
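
The "Timing out on waiting" line above is the harness giving up its wait, not the job failing on its own: the Beam Python API behind it is PipelineResult.wait_until_finish(duration=...), whose duration is in milliseconds. A minimal sketch of that pattern (the transforms are placeholders, not the load test's pipeline; duration support varies by runner, with Dataflow supporting it):

    import apache_beam as beam

    p = beam.Pipeline()  # runner/options omitted for brevity
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)

    result = p.run()
    # duration is in milliseconds; returns once the job finishes or time is up.
    result.wait_until_finish(duration=600 * 1000)
    if result.state not in ('DONE', 'CANCELLED'):
        result.cancel()
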
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0303150529.1646323435.970767/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220303160355971759-3268'
 createTime: '2022-03-03T16:04:02.591184Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-03_08_04_02-11641810371504069218'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0303150529'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-03T16:04:02.591184Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-03_08_04_02-11641810371504069218]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-03_08_04_02-11641810371504069218
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-03_08_04_02-11641810371504069218?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-03_08_04_02-11641810371504069218 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:07.112Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:09.527Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:09.563Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:09.626Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:09.722Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:09.746Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:09.852Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.012Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.070Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.109Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.138Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.233Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.299Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.367Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.400Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
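
The fusion messages above trace the shape of the read pipeline under test: read from Pub/Sub, record transit time, window, count the messages, and publish the count back for verification. A rough Beam sketch of that shape, assuming placeholder resource names (the real test derives them from the namespace prefix plus a UUID) and eliding the metric-collecting DoFns:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import window
    from apache_beam.transforms.combiners import CountCombineFn

    read_sub = 'projects/my-project/subscriptions/my-read-sub'      # placeholder
    matcher_topic = 'projects/my-project/topics/my-matcher-topic'   # placeholder

    p = beam.Pipeline(options=PipelineOptions(streaming=True))
    _ = (p
         | 'Read from pubsub' >> beam.io.ReadFromPubSub(subscription=read_sub)
         | 'Measure time' >> beam.Map(lambda msg: msg)  # stand-in for the timing DoFn
         | 'Window' >> beam.WindowInto(window.FixedWindows(60))
         | 'Count messages' >> beam.CombineGlobally(CountCombineFn()).without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
         | 'Write to Pubsub' >> beam.io.WriteToPubSub(matcher_topic))
    # p.run() would launch the streaming job; omitted here.
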
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.542Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.581Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.619Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.650Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.683Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.733Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.776Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:10.803Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:21.109Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:04:59.385Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:05:21.144Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-03T16:05:21.206Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-03_08_04_02-11641810371504069218 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f67386d6e497462d9259cee8b1136ba9 and timestamp: 1646324293.946073:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 149
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-03_07_50_43-7740904351671811608?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-03_08_04_02-11641810371504069218?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_29b58395-e5b2-4967-bb31-f1711c6d29ef_read'
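
The TypeError above is the cleanup step colliding with the google-cloud-pubsub 2.x surface: GAPIC methods such as delete_subscription accept a request object or keyword arguments, so passing the subscription path positionally, as cleanup() does, is rejected by the request constructor. A minimal sketch of the keyword-style call (subscription path is a placeholder):

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    sub_path = 'projects/my-project/subscriptions/my-sub'  # placeholder

    # pubsub>=2.0 style: pass the path by keyword (or wrap it in a
    # DeleteSubscriptionRequest); a bare positional string raises TypeError.
    subscriber.delete_subscription(subscription=sub_path)

Note that the failure happens in cleanup(), after the runtime metric above was already reported, so it is the teardown rather than the measurement that marks the build red.
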

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 54s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/luyyz6ne2nwwa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #631

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/631/display/redirect?page=changes>

Changes:

[rogelio.hernandez] [BEAM-12777] Removed current docs version redirect

[noreply] Merge pull request #16850: [BEAM-11205] Upgrade Libraries BOM

[noreply] Merge pull request #16484 from [BEAM-13633] [Playground] Implement

[noreply] Add 2022 events blog post (#16975)

[noreply] Clean up Go formatter suggestions (#16973)

[noreply] [BEAM-14012] Add go fmt to Github Actions (#16978)

[noreply] [BEAM-13911] Add basic tests to Go direct runner. (#16979)

[noreply] [BEAM-13960] Add support for more types when converting from between row


------------------------------------------
[...truncated 56.81 KB...]
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2676066 sha256=2ef280e7e1576986960fd77a50a24f125af787dfff08a2acdf9ada9c32e960dc
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.10 botocore-1.24.10 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/dataflow-worker.jar in 10 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646236240.989126/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220302155040990024-3610'
 createTime: '2022-03-02T15:50:52.443447Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-02_07_50_51-11746327552573509537'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0302150511'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-02T15:50:52.443447Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-02_07_50_51-11746327552573509537]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-02_07_50_51-11746327552573509537
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-02_07_50_51-11746327552573509537?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-02_07_50_51-11746327552573509537 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:56.639Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.324Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.358Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.420Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.454Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.482Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.507Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.528Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.579Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.605Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.640Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.664Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.684Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.713Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.749Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.788Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.863Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.893Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.923Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.959Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:57.983Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:58.042Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:58.073Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:50:58.106Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:51:35.611Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:51:44.413Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:52:09.483Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T15:52:09.556Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-02_07_50_51-11746327552573509537 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8c257fe31d004fcf88637d142b2f4d0e and timestamp: 1646237040.0579813:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 121
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0302150511.1646237044.632832/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220302160404633751-9567'
 createTime: '2022-03-02T16:04:12.474091Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-02_08_04_11-3382623296073569162'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0302150511'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-02T16:04:12.474091Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-02_08_04_11-3382623296073569162]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-02_08_04_11-3382623296073569162
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-02_08_04_11-3382623296073569162?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-02_08_04_11-3382623296073569162 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:17.487Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:18.910Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:18.946Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.004Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.069Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.097Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.166Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.222Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.251Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.271Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.295Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.352Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.374Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.408Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.438Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.464Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.498Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.562Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.586Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.638Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.671Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.704Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.737Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.781Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.806Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.862Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.893Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:19.929Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:51.604Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:55.247Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:04:55.272Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:05:05.654Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:05:29.683Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-02T16:05:29.716Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-02_08_04_11-3382623296073569162 after 605 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_6b0cba8f-b0c0-4f3e-9ea3-2259e10e212c_read_matcher.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-02_07_50_51-11746327552573509537?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-02_08_04_11-3382623296073569162?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_6b0cba8f-b0c0-4f3e-9ea3-2259e10e212c_read'
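
Note on the TypeError above: the traceback goes through
google/pubsub_v1/services/subscriber/client.py, the google-cloud-pubsub 2.x code
path, in which delete_subscription() no longer takes the subscription path as a
bare positional argument (a positional argument is treated as a request object).
A minimal sketch of the accepted keyword form, using a placeholder subscription
path and assuming ambient GCP credentials:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    # Placeholder path; the real test derives this from its namespace prefix.
    sub_path = "projects/my-project/subscriptions/my-subscription"

    # The positional form, as in the cleanup code above, raises
    # "TypeError: Invalid constructor input for DeleteSubscriptionRequest":
    #     sub_client.delete_subscription(sub_path)

    # Keyword form accepted by the 2.x API:
    sub_client.delete_subscription(subscription=sub_path)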

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 12s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qqtpqnsxymwdo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #630

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/630/display/redirect?page=changes>

Changes:

[egalpin] Use default context output rather than outputWithTimestamp for

[stranniknm] Palo Alto case study - fix link

[noreply] Build wheels for Python 3.9

[noreply] Merge pull request #16892 from [BEAM-13755] [Playground] Scroll the

[noreply] Merge pull request #16880 from [BEAM-13963][Playground] Get bucket name

[noreply] Merge pull request #16870 from [BEAM-13874][Playground] Tag multifile

[noreply] Merge pull request #16910 from [BEAM-13724] [Playground] Get the default

[noreply] [BEAM-14008] Fix incorrect guava import (#16966)

[noreply] Fix ignored exception in BatchSpannerRead. (#16960)

[noreply] [BEAM-13917] Improve coverage of databaseio package (#16956)

[noreply] [BEAM-13925] Add entry files to process new prs and pr updates for PR

[noreply] [BEAM-13899] Improve coverage of debug package (#16951)

[noreply] [BEAM-13907] Improve coverage of textio package (#16937)

[noreply] [BEAM-9150] Fix beam_PostRelease_Python_Candidate (python RC validation


------------------------------------------
[...truncated 56.58 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2676066 sha256=4dc17e42c7d3656351045ddf6ce4b5e225d60d47c3fa4bcbdfe2ffb7dddc6e9b
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.9 botocore-1.24.9 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.1 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/dataflow-worker.jar in 14 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646149842.857416/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
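
The "Discarding unparseable args" warning above appears benign here:
--pubsub_namespace_prefix is a flag consumed by the test harness itself, not one
registered with PipelineOptions. A minimal sketch of how such a flag could be
registered so it parses cleanly (PubsubNamespaceOptions is a hypothetical name,
not the test's actual options class):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registering the flag stops PipelineOptions from discarding it.
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(options.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix)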
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220301155042858340-2756'
 createTime: '2022-03-01T15:50:59.538184Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-01_07_50_58-10108917913532328221'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0301150521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-01T15:50:59.538184Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-01_07_50_58-10108917913532328221]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-01_07_50_58-10108917913532328221
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-01_07_50_58-10108917913532328221?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-01_07_50_58-10108917913532328221 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:04.086Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:04.812Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:04.843Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:04.906Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:04.946Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:04.986Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.008Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.041Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.080Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.108Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.144Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.181Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.210Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.235Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.289Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.424Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.458Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.494Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.529Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.560Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.623Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.652Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:05.703Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:22.026Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:51:50.491Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:52:15.860Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T15:52:15.886Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-01_07_50_58-10108917913532328221 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: bf25249767984d1faeddf116b3cd5e75 and timestamp: 1646150662.4288645:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 135
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0301150521.1646150666.753487/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220301160426754426-3710'
 createTime: '2022-03-01T16:04:33.090853Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-01_08_04_32-14750239134385073294'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0301150521'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-01T16:04:33.090853Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-03-01_08_04_32-14750239134385073294]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-03-01_08_04_32-14750239134385073294
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-01_08_04_32-14750239134385073294?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-03-01_08_04_32-14750239134385073294 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:44.637Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:50.521Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:55.547Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.338Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.438Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.500Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.568Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.621Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.662Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.735Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.791Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.814Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.839Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.862Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.926Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.949Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:56.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.012Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.034Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.073Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.108Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.148Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.170Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.190Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.250Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.288Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:04:57.317Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:05:07.414Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:05:42.952Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:06:08.100Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-01T16:06:08.128Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-03-01_08_04_32-14750239134385073294 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_094ffe24-dba7-4a11-bdef-603218249b10_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-01_07_50_58-10108917913532328221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-01_08_04_32-14750239134385073294?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_094ffe24-dba7-4a11-bdef-603218249b10_read'
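
The same 2.x constraint applies to this cleanup failure; an equivalent sketch
using an explicit request object instead of the keyword argument (the path is
again a placeholder):

    from google.cloud import pubsub_v1
    from google.pubsub_v1.types import DeleteSubscriptionRequest

    sub_client = pubsub_v1.SubscriberClient()
    # Build the request object explicitly, then pass it via request=.
    request = DeleteSubscriptionRequest(
        subscription="projects/my-project/subscriptions/my-subscription")
    sub_client.delete_subscription(request=request)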

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 40s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/q5jra5sf4p3na

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #629

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/629/display/redirect>

Changes:


------------------------------------------
[...truncated 56.61 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675962 sha256=6c62faf65ca53039ba926f61ff04a88b48ae7de3507a8a1854d17a24e1237459
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.8 botocore-1.24.8 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646063444.278582/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220228155044279536-6476'
 createTime: '2022-02-28T15:50:49.926827Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-28_07_50_49-7918172963082942430'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0228150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-28T15:50:49.926827Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-28_07_50_49-7918172963082942430]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-28_07_50_49-7918172963082942430
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-28_07_50_49-7918172963082942430?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-28_07_50_49-7918172963082942430 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:55.626Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.400Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.424Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.503Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.543Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.573Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.608Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.640Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.682Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.708Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.743Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.776Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:56.922Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.278Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.311Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.331Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.363Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.390Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.448Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.484Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:50:57.504Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:51:21.024Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:51:44.105Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:52:08.433Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T15:52:08.462Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-28_07_50_49-7918172963082942430 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 90025e657a384d43b25ea8f2153b998e and timestamp: 1646064257.8152697:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 87
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0228150536.1646064262.129738/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220228160422130636-6218'
 createTime: '2022-02-28T16:04:28.435677Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-28_08_04_27-13665835526733684732'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0228150536'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-28T16:04:28.435677Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-28_08_04_27-13665835526733684732]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-28_08_04_27-13665835526733684732
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-28_08_04_27-13665835526733684732?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-28_08_04_27-13665835526733684732 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:33.225Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.434Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.466Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.521Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.601Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.669Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.805Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.885Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.926Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:34.963Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.011Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.046Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.073Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.096Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.143Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.172Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.192Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.219Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.254Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.284Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.320Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.350Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.394Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.424Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.457Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.493Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.540Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.568Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:04:35.600Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:05:05.753Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
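
This descriptor-quota warning recurs on every run in the test project. A hedged sketch of the cleanup the message points at, using the google-cloud-monitoring client; the filter string is an assumption, and descriptors should be verified as unused before deleting in a shared project like apache-beam-testing:

    from google.cloud import monitoring_v3

    def delete_custom_metric_descriptors(project_id):
        # List the custom.googleapis.com/* metric descriptors and delete
        # stale ones, per the remediation suggested in the job message.
        client = monitoring_v3.MetricServiceClient()
        request = {
            "name": f"projects/{project_id}",
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        }
        for descriptor in client.list_metric_descriptors(request=request):
            client.delete_metric_descriptor(request={"name": descriptor.name})
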
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:05:20.585Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:05:44.971Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-28T16:05:45.001Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-28_08_04_27-13665835526733684732 after 603 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_409c4c52-6363-4e48-946b-e93792b01845_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-28_07_50_49-7918172963082942430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-28_08_04_27-13665835526733684732?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_409c4c52-6363-4e48-946b-e93792b01845_read'
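
Two distinct problems are chained in the failure above. First, the test itself is a timeout, not a crash: the runner stopped waiting on the read job after roughly 600 seconds, the matcher then waited a further 900 seconds on the matcher subscription and received nothing, and the hamcrest assertion reported the empty diff. A minimal sketch of that wait-then-assert flow, with a hypothetical timeout constant (Beam's wait_until_finish takes a duration in milliseconds):

    import apache_beam as beam

    def run_and_wait(pipeline: beam.Pipeline, timeout_ms: int = 600 * 1000):
        # wait_until_finish(duration=...) takes milliseconds and returns
        # even if the streaming job is still running; the test runner then
        # evaluates its on_success_matcher, which is what raised above.
        result = pipeline.run()
        result.wait_until_finish(duration=timeout_ms)
        return result

Second, the cleanup TypeError masks the assertion: the google-cloud-pubsub 2.x GAPIC layer treats the first positional argument of delete_subscription as a DeleteSubscriptionRequest and rejects a bare path string. A hedged sketch of the keyword form the 2.x client accepts, with the subscription path copied from the error message:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_409c4c52-6363-4e48-946b-e93792b01845_read')
    # The positional form, sub_client.delete_subscription(sub_path), is
    # what raised the TypeError above; the keyword form builds the
    # request proto correctly:
    sub_client.delete_subscription(subscription=sub_path)
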

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 17s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lrhc6c5vtcews

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #628

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/628/display/redirect>

Changes:


------------------------------------------
[...truncated 56.14 KB...]
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675962 sha256=92fc9c6b4694252265ad093da72de71d9fefa93f6ef781f615af37672a7f2529
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.8 botocore-1.24.8 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977025.747262/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
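
The warning means --pubsub_namespace_prefix is not registered with any PipelineOptions subclass, so it is dropped before the pipeline sees it. A minimal sketch of how such a flag would be registered so it parses cleanly; the class name is hypothetical:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical class name
        @classmethod
        def _add_argparse_args(cls, parser):
            # Once the flag is registered here, it parses cleanly instead
            # of being reported as unparseable and discarded.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Namespace prefix for test topics and subscriptions.')
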
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220227155025748247-3792'
 createTime: '2022-02-27T15:50:31.691986Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-27_07_50_31-5929707858288084147'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0227150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-27T15:50:31.691986Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-27_07_50_31-5929707858288084147]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-27_07_50_31-5929707858288084147
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-27_07_50_31-5929707858288084147?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-27_07_50_31-5929707858288084147 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:37.779Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.651Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.676Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.753Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.784Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.818Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.843Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.874Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.918Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.953Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:38.989Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.023Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.058Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.090Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.163Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
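
For the write phase, the fusion log above implies a bounded synthetic source feeding Pub/Sub: Create input, a formatting step, a timing step, and the Pub/Sub write. A rough sketch of that shape; the message count and size are assumptions consistent with a ~2 GB payload, and the real test uses a splittable synthetic source where Create stands in here:

    import apache_beam as beam

    def build_write_pipeline(p, topic, num_messages=2_097_152, msg_size=1024):
        # Step names follow the fusion log: Create input -> Format to
        # pubsub message in bytes -> Measure time -> Write to Pubsub.
        return (
            p
            | 'Create input' >> beam.Create(range(num_messages))
            | 'Format to pubsub message in bytes' >> beam.Map(
                lambda _: b'x' * msg_size)
            | 'Measure time' >> beam.Map(lambda msg: msg)  # metric stand-in
            | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic=topic)
        )
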
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.258Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.283Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.312Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.355Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.390Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.452Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.469Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:39.516Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:50:50.501Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:51:22.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:51:46.807Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T15:51:46.839Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-27_07_50_31-5929707858288084147 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 19d7d7dd9acc4d859828ce3f6ad0cafb and timestamp: 1645977819.0675454:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 104
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0227150516.1645977823.931534/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220227160343932442-6939'
 createTime: '2022-02-27T16:03:50.598337Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-27_08_03_49-3058024952707373447'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0227150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-27T16:03:50.598337Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-27_08_03_49-3058024952707373447]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-27_08_03_49-3058024952707373447
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-27_08_03_49-3058024952707373447?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-27_08_03_49-3058024952707373447 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:01.775Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.609Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.641Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.685Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.737Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.764Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.806Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.849Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.879Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:02.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.057Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.079Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.168Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.240Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.272Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.303Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.334Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.372Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.411Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.437Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.459Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.522Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.539Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:03.573Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:34.627Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:37.008Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:37.042Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:04:47.393Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:05:12.001Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-27T16:05:12.028Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-27_08_03_49-3058024952707373447 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_6d1affcd-5159-4536-b6fe-fdb6a0da28f3_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-27_07_50_31-5929707858288084147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-27_08_03_49-3058024952707373447?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_6d1affcd-5159-4536-b6fe-fdb6a0da28f3_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 55s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tmdxkzxwge35c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #627

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/627/display/redirect?page=changes>

Changes:

[Ankur Goenka] [BEAM-13952] Sickbaying

[noreply] Memoize some objects for timer processing to reduce overhead. (#16207)

[noreply] [BEAM-13965] Use TypeDeserializer if type information is available to

[noreply] [BEAM-13912] Add more coverage for dataflow.go (#16903)

[noreply] [BEAM-12563] swaplevel general function for dataframe and series

[noreply] [BEAM-14001] Update coder.go unit tests (#16952)

[noreply] [BEAM-13910] Improve coverage of gcsx package (#16942)

[noreply] [BEAM-13015] Use a DirectExecutor for state since we are just completing


------------------------------------------
[...truncated 55.51 KB...]
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675962 sha256=f8fe373d8940c37fe72e8a18a90c1c4145a23bf38f688a02a56401f1cd0adc87
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.8 botocore-1.24.8 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.3.1 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645890635.179784/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220226155035180706-1838'
 createTime: '2022-02-26T15:50:42.348036Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-26_07_50_41-4724487586830618261'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0226150524'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-26T15:50:42.348036Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-26_07_50_41-4724487586830618261]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-26_07_50_41-4724487586830618261
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-26_07_50_41-4724487586830618261?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-26_07_50_41-4724487586830618261 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:48.201Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.480Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.526Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.608Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.649Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.678Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.702Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.728Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.770Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.804Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.837Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.873Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.963Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:49.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.098Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.129Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.168Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.208Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.246Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.310Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.331Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:50:50.387Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:51:04.333Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:51:31.680Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:51:56.905Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T15:51:56.931Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-26_07_50_41-4724487586830618261 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 02f3b1896a5a47e4bf84cd9e5747ce32 and timestamp: 1645891434.9481552:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0226150524.1645891438.643285/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
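This warning is benign here: --pubsub_namespace_prefix is consumed by the test harness itself, so Beam's option parser discards it. A flag meant for the pipeline would instead be registered on a PipelineOptions subclass; a sketch, with the class name hypothetical:

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
      @classmethod
      def _add_argparse_args(cls, parser):
        # Registering the flag is what would stop it being "unparseable".
        parser.add_argument('--pubsub_namespace_prefix', default=None)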
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220226160358644161-2705'
 createTime: '2022-02-26T16:04:06.312208Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-26_08_04_05-16434339966832439962'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0226150524'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-26T16:04:06.312208Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-26_08_04_05-16434339966832439962]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-26_08_04_05-16434339966832439962
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-26_08_04_05-16434339966832439962?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-26_08_04_05-16434339966832439962 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:11.994Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:12.842Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:12.879Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:12.936Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:12.981Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.008Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.073Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.139Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.189Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.235Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.269Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.335Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.359Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.388Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.423Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.468Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.510Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.543Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.590Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.611Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
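Read back from the fused steps above, the read-side pipeline has roughly this shape. A sketch only: the transform labels come from the log, while the bodies, the windowing, and the SUB/TOPIC placeholders are assumptions:

    import apache_beam as beam
    from apache_beam.transforms import combiners, window

    SUB = 'projects/apache-beam-testing/subscriptions/<generated>_read'     # placeholder
    TOPIC = 'projects/apache-beam-testing/topics/<generated>_read_matcher'  # placeholder

    def build(p):
      return (
          p
          | 'Read from pubsub' >> beam.io.ReadFromPubSub(subscription=SUB)
          | 'Map' >> beam.Map(lambda data: data)     # stands in for the lambda at pubsub_io_perf_test.py:171
          | 'Measure time' >> beam.Map(lambda x: x)  # the real test records runtime metrics here
          | 'Window' >> beam.WindowInto(window.FixedWindows(60))  # window size assumed
          | 'Count messages' >> beam.CombineGlobally(combiners.CountCombineFn()).without_defaults()
          | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode())
          | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic=TOPIC)
      )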
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.652Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.683Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.713Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.736Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.762Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.821Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.843Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:13.877Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:33.162Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:04:57.121Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:05:20.681Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-26T16:05:20.710Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-26_08_04_05-16434339966832439962 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 27b8fc997c5e46c49312051ed7e0abb1 and timestamp: 1645892415.0524724:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 336
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-26_07_50_41-4724487586830618261?project=apache-beam-testing
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-26_08_04_05-16434339966832439962?project=apache-beam-testing
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_ca19a863-ad06-4e0e-a415-82ea7569385b_read'
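The TypeError above is the test's cleanup step tripping over the google-cloud-pubsub 2.x surface (the runs in this log install google-cloud-pubsub 2.9.0; see the package list further down): the generated clients take a request object or keyword arguments, not a positional string. A sketch of the call shapes, with a hypothetical subscription name:

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    sub = 'projects/apache-beam-testing/subscriptions/example_read'  # hypothetical

    # subscriber.delete_subscription(sub)              # 1.x style; raises the TypeError above on 2.x
    subscriber.delete_subscription(subscription=sub)   # 2.x keyword form
    # ...or the equivalent request-dict form:
    # subscriber.delete_subscription(request={'subscription': sub})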

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ujsclop4rh2co

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #626

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/626/display/redirect?page=changes>

Changes:

[benjamin.gonzalez] Revert PR#16253 due to errors with plugin flaky-test-handler

[noreply] Fix BoundedQueueExecutor and StreamingDataflowWorker to actually limit

[noreply] [BEAM-1857] Add Neo4jIO (#15916)

[noreply] [BEAM-13767] Migrate several portable runner tasks to use configuration

[noreply] [BEAM-13996] Removing 'No cluster_manager is associated with the

[noreply] [BEAM-13906] Improve coverage of errors package (#16934)

[noreply] [BEAM-13886] unit tests for trigger package (#16935)

[noreply] [BEAM-4767] Remove beam- prefix from release script tags (#16899)

[noreply] [BEAM-13866] Add small unit tests to errorx, make boolean assignment

[noreply] [BEAM-13925] Add most of the supporting files for the pr management

[noreply] Merge pull request #16846 from [BEAM-12164]: Add sad path tests for


------------------------------------------
[...truncated 56.63 KB...]
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675832 sha256=6a253891ce5922a653da9430e8322019f3ea9236627953bb2a2ae7f192018350
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.7 botocore-1.24.7 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.2 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645804235.080367/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220225155035081314-7140'
 createTime: '2022-02-25T15:50:41.929228Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-25_07_50_41-2359226830683459320'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0225150510'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-25T15:50:41.929228Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-25_07_50_41-2359226830683459320]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-25_07_50_41-2359226830683459320
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-25_07_50_41-2359226830683459320?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-25_07_50_41-2359226830683459320 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:50.562Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.499Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.539Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.604Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.646Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.693Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.724Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.755Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.798Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.854Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.896Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.934Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.974Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:53.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.032Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
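For contrast with the read job later in this message, the fused write-side graph above amounts to: a bounded synthetic source expanded via SDF, formatted into fixed-size payloads, and published. A sketch, with the element count and payload size assumed to match a 2 GB / 1 KB test:

    import apache_beam as beam

    TOPIC = 'projects/apache-beam-testing/topics/<generated>_read'  # placeholder

    def build(p):
      return (
          p
          | 'Create input' >> beam.Create(range(2 * 1024 * 1024))  # stands in for the SyntheticSource SDF read
          | 'Format to pubsub message in bytes' >> beam.Map(lambda _: b'x' * 1024)  # 1 KB payloads assumed
          | 'Measure time' >> beam.Map(lambda x: x)  # the real test records runtime metrics here
          | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic=TOPIC)
      )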
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.172Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.203Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.257Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.289Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.323Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.379Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.510Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:50:54.548Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:51:01.211Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:51:44.526Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:52:10.858Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T15:52:10.890Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-25_07_50_41-2359226830683459320 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 5d7608908fde4b5d802fedababad7285 and timestamp: 1645805020.2920804:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 116
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0225150510.1645805023.630576/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220225160343631476-1640'
 createTime: '2022-02-25T16:03:50.177250Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-25_08_03_49-4770515018172207499'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0225150510'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-25T16:03:50.177250Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-25_08_03_49-4770515018172207499]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-25_08_03_49-4770515018172207499
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-25_08_03_49-4770515018172207499?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-25_08_03_49-4770515018172207499 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:56.512Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.475Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.503Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.581Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.659Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.695Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.760Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.858Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.896Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.936Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.968Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:57.993Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.161Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.191Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.295Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.330Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.371Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.411Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.442Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.477Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.499Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.533Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.615Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.642Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:03:58.683Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:04:11.135Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:04:38.624Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:05:03.631Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-25T16:05:03.658Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-25_08_03_49-4770515018172207499 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_257b814c-d50e-40af-9f11-7034dd12adb5_read_matcher.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-25_07_50_41-2359226830683459320?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-25_08_03_49-4770515018172207499?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_257b814c-d50e-40af-9f11-7034dd12adb5_read'
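Two failures are chained above. The AssertionError is the matcher's: it expected one verification message whose payload is the message count the read pipeline should have produced, and with 2 GB of input in 1 KB messages that count is 2097152; the sizes are inferred from the job name and the expected payload, as the arithmetic below shows. The TypeError raised during cleanup is the same google-cloud-pubsub 2.x call-shape issue sketched earlier in this log.

    # Where b'2097152' comes from (sizes inferred, not taken from the log):
    input_size = 2 * 1024 ** 3   # 2 GB of generated input, per the "-2gb" job name
    message_size = 1024          # assumed 1 KB per published message
    expected_payload = str(input_size // message_size).encode()
    assert expected_payload == b'2097152'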

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 49s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/g2qf3ersj2aie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #625

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/625/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-13796] projection pushdown in BQ IO

[Kyle Weaver] [BEAM-13796] Move test to ReadTest class and correct javadoc for

[Kyle Weaver] [BEAM-13796] Pushdown is not supported on TypedRead#fromQuery.

[noreply] [BEAM-13738] Reenable ignored SQS test after bumping elasticmq for fixed

[noreply] fix build status link (#16907)

[noreply] Merge pull request #16549 from [BEAM-13681][Playground] Refactoring

[noreply] Merge pull request #16732 from [BEAM-13825] [Playground] updated

[noreply] Merge pull request #16683 from [BEAM-13713][Playground] Java graph

[noreply] case study pages - improvements and fixes (#16896)

[noreply] Palo Alto case study (#16915)

[noreply] [BEAM-12645] Fix code-cov flakes due to monorepo. (#16925)

[noreply] [BEAM-13969] Deprecate stringx package (#16884)

[noreply] Add Go badge to ReadMe (#16897)

[noreply] [BEAM-13980] Re-add method gone missing in af2f8ee6 (#16918)

[noreply] [BEAM-13884] Improve mtime package (#16924)

[noreply] Minor: Update Go API doc links (#16932)

[noreply] [BEAM-13218] Re-enable

[noreply] Merge pull request #16857 from [BEAM-13662] [Playground] Support

[noreply] Merge pull request #16826 from [BEAM-13870] [Playground] Increase test


------------------------------------------
[...truncated 56.74 KB...]
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675920 sha256=cbd187a5203c7a091cd0ac147e1a858f1808d76ba7699def5a04c89c821ec2f7
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.6 botocore-1.24.6 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.0 googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645717913.371557/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220224155153373144-6570'
 createTime: '2022-02-24T15:52:01.214194Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-24_07_52_00-4623073973927747043'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0224150550'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-24T15:52:01.214194Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-24_07_52_00-4623073973927747043]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-24_07_52_00-4623073973927747043
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-24_07_52_00-4623073973927747043?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-24_07_52_00-4623073973927747043 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:07.214Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.294Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.360Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.404Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.441Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.473Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.508Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.552Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.625Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.660Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.728Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.764Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.795Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.898Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.932Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.959Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:13.993Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:14.030Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:14.106Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:14.146Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:14.194Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:52:23.245Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
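
The metric-descriptor message above is informational: the apache-beam-testing project already holds 100 Dataflow-created custom descriptors, so new custom.googleapis.com/* descriptors are skipped while dataflow.googleapis.com/job/user_counter still carries all user metrics. If stale descriptors ever need pruning, a rough sketch with the Cloud Monitoring client (project name and filter are illustrative, not taken from this build):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # List the project's custom metric descriptors, then delete them.
    pager = client.list_metric_descriptors(request={
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in pager:
        client.delete_metric_descriptor(name=descriptor.name)
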
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:53:00.310Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:53:25.043Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T15:53:25.075Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-24_07_52_00-4623073973927747043 after 604 seconds
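
The timeout above is expected for these streaming load tests: the harness submits the job, waits a bounded amount of time, and then collects metrics rather than waiting for the streaming job to drain. A minimal sketch of that pattern under assumed wiring (DirectRunner here so the snippet stands alone):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    pipeline = beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner']))
    _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(print)

    result = pipeline.run()
    # On Dataflow the test passes a bounded duration in milliseconds, e.g.
    #   result.wait_until_finish(duration=10 * 60 * 1000)
    # and the call returns even while the job keeps running, producing the
    # "Timing out on waiting for job ..." warning above. DirectRunner does
    # not support the duration argument, so wait to completion here:
    result.wait_until_finish()
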
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 3c76ca1ab5224b819cfc9c60bd822ec9 and timestamp: 1645718722.6657448:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 107
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/dataflow-****.jar in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0224150550.1645718726.985732/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220224160526986691-5143'
 createTime: '2022-02-24T16:05:35.362290Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-24_08_05_34-11281013861495597201'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0224150550'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-24T16:05:35.362290Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-24_08_05_34-11281013861495597201]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-24_08_05_34-11281013861495597201
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-24_08_05_34-11281013861495597201?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-24_08_05_34-11281013861495597201 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:42.693Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:43.846Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:43.875Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:43.941Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.012Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.042Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.121Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.188Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.213Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.248Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.268Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.301Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.388Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.557Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.590Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.648Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.678Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.711Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.759Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.789Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.846Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.880Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:05:44.911Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:06:12.213Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:06:31.616Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:06:58.383Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-24T16:06:58.429Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-24_08_05_34-11281013861495597201 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8e0cfa316c1543e293574e682ee0a869 and timestamp: 1645719617.0905201:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 176
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-24_07_52_00-4623073973927747043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-24_08_05_34-11281013861495597201?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_918b0704-845c-431d-b09d-335d4f2f97e7_read'
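
The TypeError above is the actual failure: google-cloud-pubsub 2.x moved to request-object methods, so passing the subscription path as the first positional argument makes the client try to build a DeleteSubscriptionRequest from a bare string. A sketch of the failing call and the 2.x-style fix (client and path names here are illustrative, not the test's exact wiring):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/apache-beam-testing/subscriptions/example_read'

    # Fails on google-cloud-pubsub >= 2.0 with the TypeError above, because
    # the first positional parameter is now the request object:
    # sub_client.delete_subscription(sub_path)

    # Either 2.x form works:
    sub_client.delete_subscription(subscription=sub_path)
    # sub_client.delete_subscription(request={'subscription': sub_path})
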

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vo5pxdqfuys5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #623

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/623/display/redirect>

Changes:


------------------------------------------
[...truncated 55.60 KB...]
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675864 sha256=a5898843da87041639aec1e438493793989f25808d41d52d813cfb650f27e286
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.4 botocore-1.24.4 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.12.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545042.666475/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220222155042667381-9373'
 createTime: '2022-02-22T15:50:48.790865Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-22_07_50_47-3013033407226126730'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0222152349'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-22T15:50:48.790865Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-22_07_50_47-3013033407226126730]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-22_07_50_47-3013033407226126730
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-22_07_50_47-3013033407226126730?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-22_07_50_47-3013033407226126730 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:53.458Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.557Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.580Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.634Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.688Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.722Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.755Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.788Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.841Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.870Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.894Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.960Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:54.995Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.021Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.050Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.075Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.321Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.353Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.400Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.449Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.484Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.547Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.589Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:50:55.629Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:51:03.711Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:51:40.613Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:52:03.737Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T15:52:03.765Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-22_07_50_47-3013033407226126730 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 7fe4fdae2ee7425193523f4cf00f3521 and timestamp: 1645545852.1576464:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 148
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0222152349.1645545855.764913/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220222160415765829-6723'
 createTime: '2022-02-22T16:04:21.479441Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-22_08_04_21-3791021310434319447'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0222152349'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-22T16:04:21.479441Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-22_08_04_21-3791021310434319447]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-22_08_04_21-3791021310434319447
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-22_08_04_21-3791021310434319447?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-22_08_04_21-3791021310434319447 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:27.204Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.157Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.188Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.254Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.322Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.373Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.433Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.498Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.528Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.597Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.628Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.654Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.676Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.697Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.718Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.751Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:28.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.019Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.085Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.142Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.208Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.270Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.307Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.335Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.372Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.424Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.472Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:04:29.501Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:05:04.677Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:05:14.421Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:05:43.279Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-22T16:05:43.306Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-22_08_04_21-3791021310434319447 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: b3c61359e18f4c53b4e922e17bce6b22 and timestamp: 1645546670.4265056:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 117
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-22_07_50_47-3013033407226126730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-22_08_04_21-3791021310434319447?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_fd314d3c-518a-4b87-9712-a9f162a586ea_read'
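
Same root cause as the earlier run: a bare subscription path passed positionally where google-cloud-pubsub 2.x expects a DeleteSubscriptionRequest; the keyword-argument form sketched after the first traceback applies here as well.
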

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 22s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/f46i26fwnqrv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #622

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/622/display/redirect>

Changes:


------------------------------------------
[...truncated 56.81 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675864 sha256=68d7614a22fedad74a8d1b387de5597403af7b50de1b575bd4a6d24dabe23991
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.3 botocore-1.24.3 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466030.911121/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220221175350912691-1388'
 createTime: '2022-02-21T17:53:57.503146Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-21_09_53_57-11885003464483037806'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0219185812'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-21T17:53:57.503146Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-21_09_53_57-11885003464483037806]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-21_09_53_57-11885003464483037806
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-21_09_53_57-11885003464483037806?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-21_09_53_57-11885003464483037806 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:01.947Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.686Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.713Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.772Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.817Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.856Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.890Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.925Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.967Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:02.997Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.039Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.061Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.095Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.132Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.156Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.208Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.302Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.326Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.355Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.392Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.415Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.462Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.490Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:03.528Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:18.671Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:54:47.561Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:55:14.364Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T17:55:14.393Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-21_09_53_57-11885003464483037806 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8874ecb6b75940728d913d205ddbf8da and timestamp: 1645466829.9617379:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 101
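[Note: the ~600-second timeout above is presumably the load test's own bound on waiting for the streaming write job, after which it records the runtime metric and moves on to the read phase. Roughly, in terms of the Beam API involved (a sketch, not the actual test code; "result" stands for the DataflowPipelineResult returned by run()):

    # Wait up to 10 minutes for the streaming job; the Beam API takes the
    # duration in milliseconds and emits the "Timing out on waiting" warning
    # seen above when the job is still running when the wait expires.
    result.wait_until_finish(duration=10 * 60 * 1000)
]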
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219185812.1645466834.376095/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220221180714377054-7319'
 createTime: '2022-02-21T18:07:21.291434Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-21_10_07_20-14369273131257318554'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0219185812'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-21T18:07:21.291434Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-21_10_07_20-14369273131257318554]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-21_10_07_20-14369273131257318554
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-21_10_07_20-14369273131257318554?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-21_10_07_20-14369273131257318554 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:28.388Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.325Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.359Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.424Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.511Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.531Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.593Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.650Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.684Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.734Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.770Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.802Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.835Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.869Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.905Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:29.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.006Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.036Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.070Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.151Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.182Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.218Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.247Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.269Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.300Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.365Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.398Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:30.432Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:07:55.511Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:08:15.430Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:08:40.752Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-21T18:08:40.787Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-21_10_07_20-14369273131257318554 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_28ab555f-53d3-4a57-822b-fea60a1d4ee7_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-21_09_53_57-11885003464483037806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-21_10_07_20-14369273131257318554?project=apache-beam-testing
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 149, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
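
[Note: the expected payload b'2097152' appears to be the message count the read pipeline is supposed to publish back for the matcher: 2,097,152 = 2 * 1024 * 1024, consistent with the 2 GB input being split into 1 KB messages (the 1 KB message size is an assumption suggested by the test name, not confirmed by this log). The arithmetic:

    # Hypothetical reconstruction of the expected count; the 1 KiB message
    # size is an assumption, not something this log confirms.
    total_bytes = 2 * 1024 ** 3            # 2 GiB of generated input
    message_size = 1024                    # assumed 1 KiB per message
    print(total_bytes // message_size)     # -> 2097152, i.e. b'2097152'
]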


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_28ab555f-53d3-4a57-822b-fea60a1d4ee7_read'
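
[Note: this TypeError, not the matcher timeout, is what ultimately fails the task: cleanup() passes the subscription path to delete_subscription() as a positional string, the google-cloud-pubsub 2.x client binds that first positional argument to its request parameter, and the proto-plus DeleteSubscriptionRequest constructor rejects a bare string. A minimal sketch of a call the 2.x client does accept (the client setup is illustrative, not taken from the test code):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_28ab555f-53d3-4a57-822b-fea60a1d4ee7_read')

    # Passing the resource name as a keyword argument avoids the
    # DeleteSubscriptionRequest(mapping) code path that raises above.
    sub_client.delete_subscription(subscription=sub_path)

Because the exception is raised during cleanup, it also masks the original AssertionError in the task's exit status.]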

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 12s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7bnyjtidmqpx2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #621

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/621/display/redirect?page=changes>

Changes:

[rogelio.hernandez] [BEAM-13051] Pylint misplaced-bare-raise warning enabled

[rogelio.hernandez] [BEAM-13051] Added descriptions to Kinesis and PortableRunner exceptions

[thiagotnunes] fix: fix bug when retrieving either string or json

[noreply] [BEAM-13812] Integrate DataprocClusterManager into Interactive

[noreply] [BEAM-12572] Fix failing python examples tests in Dataflow runner

[noreply] Remove build status from PR (#16902)


------------------------------------------
[...truncated 55.97 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2675864 sha256=c7f4fe35453f384fc34780d2f3dc5bc5671d8ea78564f9478cb56d9d3235cf29
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.3 botocore-1.24.3 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.34.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.4.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645285821.140664/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220219155021141619-9556'
 createTime: '2022-02-19T15:50:27.148756Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-19_07_50_26-1368481568639836688'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0219151239'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-19T15:50:27.148756Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-19_07_50_26-1368481568639836688]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-19_07_50_26-1368481568639836688
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-19_07_50_26-1368481568639836688?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-19_07_50_26-1368481568639836688 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:34.231Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:39.897Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.005Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.056Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.089Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.123Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.157Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.190Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.233Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.264Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.289Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.322Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.387Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.408Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.434Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.562Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.595Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.626Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.655Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.712Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.743Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:40.789Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:50:59.494Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:51:27.923Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:51:54.958Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T15:51:54.993Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-19_07_50_26-1368481568639836688 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 0019e6177df945d1be9ec3c577c7b787 and timestamp: 1645286601.2635722:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 105
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0219151239.1645286607.792038/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220219160327792947-2198'
 createTime: '2022-02-19T16:03:33.236573Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-19_08_03_32-6421470501221484683'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0219151239'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-19T16:03:33.236573Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-19_08_03_32-6421470501221484683]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-19_08_03_32-6421470501221484683
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-19_08_03_32-6421470501221484683?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-19_08_03_32-6421470501221484683 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:42.759Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:48.537Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:53.558Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:54.779Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:54.849Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:54.867Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:54.924Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:54.990Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.022Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.056Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.122Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.151Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.183Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.215Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.248Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.282Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.314Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.437Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.498Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.530Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.560Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.594Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.648Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.708Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.735Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:03:55.788Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:04:20.182Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:04:40.439Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:05:07.614Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-19T16:05:07.652Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-19_08_03_32-6421470501221484683 after 603 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_9802e8dd-a963-41b9-99b4-64af18d5baf8_read_matcher.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-19_07_50_26-1368481568639836688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-19_08_03_32-6421470501221484683?project=apache-beam-testing
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 149, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_9802e8dd-a963-41b9-99b4-64af18d5baf8_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 51s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qggvog377s3j4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #620

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/620/display/redirect?page=changes>

Changes:

[Pablo Estrada] Simplify README for new users

[laraschmidt] Fix final allowskew error to properly handle a large allowedSkew

[noreply] [BEAM-13946] Add get_dummies(), a non-deferred column operation on

[noreply] [release-2.36.0] Fix pickler argument for 2.36 blog (#16774)

[noreply] [adhoc] Avoid using SerializablePipelineOptions for testing to minimize


------------------------------------------
[...truncated 56.21 KB...]
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2661809 sha256=05dc841628b541a941fa71a0019ee8f018aebf8be89d0176709beca9e19f3ec1
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.2 botocore-1.24.2 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.33.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645199677.075814/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
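
The "Discarding unparseable args" warnings mean that no registered PipelineOptions subclass declares --pubsub_namespace_prefix, so Beam's option parser drops the flag from the pipeline options. A minimal sketch of how such a flag can be declared so it parses instead of being discarded (the class name and default below are illustrative, not the actual load-test code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        """Hypothetical options subclass registering the custom flag."""

        @classmethod
        def _add_argparse_args(cls, parser):
            # Once declared here, the flag is parsed into the pipeline
            # options instead of triggering the warning above.
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default='pubsub_io_performance_',
                help='Prefix for Pub/Sub topics and subscriptions.')

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix
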
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220218155437076726-5068'
 createTime: '2022-02-18T15:54:43.823890Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-18_07_54_43-7183872629207638110'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0218152527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-18T15:54:43.823890Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-18_07_54_43-7183872629207638110]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-18_07_54_43-7183872629207638110
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-18_07_54_43-7183872629207638110?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-18_07_54_43-7183872629207638110 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:48.115Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:48.900Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:48.920Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:48.963Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:48.993Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.016Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.039Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.073Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.104Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.156Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.182Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.208Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.232Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.254Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.278Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.396Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.423Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.450Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.474Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.531Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.559Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:54:49.584Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:55:10.209Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:55:35.710Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:55:57.488Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T15:55:57.519Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-18_07_54_43-7183872629207638110 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: eff0318f7bb346a3ab2c10e27dc9451c and timestamp: 1645200461.6275446:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0218152527.1645200465.142874/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220218160745143783-1508'
 createTime: '2022-02-18T16:07:51.766880Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-18_08_07_51-5277058258234709698'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0218152527'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-18T16:07:51.766880Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-18_08_07_51-5277058258234709698]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-18_08_07_51-5277058258234709698
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-18_08_07_51-5277058258234709698?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-18_08_07_51-5277058258234709698 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:57.414Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.331Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.364Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.413Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.487Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.519Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.577Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.643Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.681Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.720Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.748Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.781Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.818Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.859Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.891Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.925Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:58.982Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.016Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.050Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.085Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.116Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.184Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.220Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.252Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.289Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.338Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.370Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:07:59.412Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:08:20.129Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:08:40.316Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:09:06.088Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-18T16:09:06.119Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-18_08_07_51-5277058258234709698 after 600 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_0c90810d-b106-4f95-aa88-252330674c54_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-18_07_54_43-7183872629207638110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-18_08_07_51-5277058258234709698?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
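
The read pipeline counts the consumed messages (the Count messages/* stages fused above) and publishes the single total back to Pub/Sub, so the matcher timing out with 0 messages means that final count never arrived. As a rough cross-check, assuming the load test's nominal 1 KiB message size (an inference from the 2 GB job name, not something shown in this log): 2 GiB / 1 KiB = 2^31 / 2^10 = 2^21 = 2,097,152, which matches the expected payload b'2097152'.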


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_0c90810d-b106-4f95-aa88-252330674c54_read'
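
The cleanup failure above is an API-compatibility issue rather than a pipeline issue: in google-cloud-pubsub 2.x, SubscriberClient.delete_subscription() takes the subscription path as a keyword-only argument, so a positional string is treated as the request parameter and forwarded to the DeleteSubscriptionRequest constructor, which proto-plus rejects. A minimal sketch of the failing call and the 2.x-compatible form (the subscription path below is illustrative; the test generates a random suffix):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()  # needs GCP credentials to issue RPCs
    # Illustrative path, standing in for the generated pubsub_io_performance_* name.
    sub_path = 'projects/apache-beam-testing/subscriptions/example_read'

    # Fails on google-cloud-pubsub >= 2.0 with the TypeError above:
    #   sub_client.delete_subscription(sub_path)
    # Keyword form accepted by the 2.x client:
    sub_client.delete_subscription(subscription=sub_path)
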

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 35s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7czko5xlfbpok

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #619

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/619/display/redirect?page=changes>

Changes:

[Jeff Tapper] Update Java LTS roadmap info on website for Java 17

[Kyle Weaver] [BEAM-13106] Support Flink 1.14.

[Kyle Weaver] [BEAM-13106] Reuse executor instead of shutting it down mid-test.

[Kyle Weaver] [BEAM-13106] Prevent infinite wait in Flink savepoint test.

[Kenneth Knowles] Disable AfterSynchronizedProcessingTime test on Dataflow

[Kyle Weaver] [BEAM-13106] A couple additional fixes to FlinkSavepointTest.

[mmack] [adhoc] Migrate KinesisIOIT to use ITEnvironment for Localstack based IT

[noreply] [BEAM-13955] Fix pylint breakage from #16836 (#16867)

[relax] Fix TableRow conversion for the case of fields named "f"

[noreply] Bump dataflow.fnapi_container_version (#16874)

[mmack] [BEAM-13563] Introducing common AWS ClientBuilderFactory to unify

[noreply] Case studies page improvements (#16702)


------------------------------------------
[...truncated 56.14 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2660304 sha256=8a2f8fab43e97c8bb9399ff9261858ab6276addff228ed0acaf1721458ad09f5
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.2 botocore-1.24.2 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.33.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.2 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645130504.303912/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220217204144304834-9826'
 createTime: '2022-02-17T20:41:51.430034Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-17_12_41_50-7128262364623807974'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0217195303'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-17T20:41:51.430034Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-17_12_41_50-7128262364623807974]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-17_12_41_50-7128262364623807974
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-17_12_41_50-7128262364623807974?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-17_12_41_50-7128262364623807974 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:55.857Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.571Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.599Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.668Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.717Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.743Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.772Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.790Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.821Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.845Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.871Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.899Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.926Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:56.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.014Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.155Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.193Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.223Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.254Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.312Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.347Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:41:57.378Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:42:24.378Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:42:40.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:43:04.947Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:43:04.980Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-17_12_41_50-7128262364623807974 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: b93eb637f4404370bd26cd7d6d692ae1 and timestamp: 1645131293.596195:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 183
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0217195303.1645131298.525875/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220217205458526811-5022'
 createTime: '2022-02-17T20:55:06.772960Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-17_12_55_05-15250429399334545975'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0217195303'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-17T20:55:06.772960Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-17_12_55_05-15250429399334545975]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-17_12_55_05-15250429399334545975
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-17_12_55_05-15250429399334545975?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-17_12_55_05-15250429399334545975 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:13.788Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.565Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.595Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.663Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.749Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.778Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.834Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.898Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.928Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.966Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:14.999Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.033Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.085Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.109Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.139Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.174Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.206Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.281Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.305Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.338Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.358Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.427Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.457Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.478Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.501Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.552Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.583Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:15.632Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:30.656Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:55:59.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:56:20.903Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-17T20:56:20.934Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-17_12_55_05-15250429399334545975 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 12f12b533f7e44e5ac13ed6b8a74b41a and timestamp: 1645132203.3429167:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 363
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 12f12b533f7e44e5ac13ed6b8a74b41a and timestamp: 1645132203.3429167:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 363
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-17_12_41_50-7128262364623807974?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-17_12_55_05-15250429399334545975?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_c5477f9e-4bdf-4974-bcce-9a6093e86c41_read'
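
This is the same positional-argument incompatibility as in the build above; the keyword form delete_subscription(subscription=...) sketched there avoids the TypeError here as well.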

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 41s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zbc567yovghis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #618

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/618/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12712] Spark: Exclude looping timer tests.

[Kyle Weaver] [BEAM-13919] Annotate PerKeyOrderingTest with UsesStatefulParDo.

[noreply] Update 2.36.0 blog post to mention ARM64 support

[stranniknm] [BEAM-13785] playground - enable scio sdk

[noreply] Minor: Disable checker framework in nightly snapshot (#16829)

[artur.khanin] Updated example link

[noreply] [BEAM-13860] Make `DoFn.infer_output_type` return element type (#16788)

[noreply] [BEAM-13894] Unit test utilities in the ptest package (#16830)

[Kenneth Knowles] Add test for processing time continuation trigger

[noreply] [BEAM-13922] [Coverage] Make boot.go more testable and add tests

[noreply] Exclude SpannerChangeStream IT from Dataflow V1 postcommit (#16851)

[noreply] [BEAM-13930] Address StateSpec consistency issue between Runner and Fn

[Ismaël Mejía] [BEAM-13202] Fix typos on tests names for VarianceFnTest

[mattcasters] [BEAM-13854] Document casting trick for Avro value serializer in KafkaIO

[Ismaël Mejía] [BEAM-13202] Add Coder to CountIfFn.Accum

[Ismaël Mejía] [BEAM-13202] Reuse Count transform code since CountIf is a specific case

[noreply] Merge pull request #16838 from [BEAM-13931] - make sure large rows cause

[Kenneth Knowles] Add test category UsesProcessingTimeTimers

[Kenneth Knowles] Label tests that need UsesProcessingTimeTimers

[Kenneth Knowles] Exclude UsesProcessingTimeTimers from SamzaRunner tests

[noreply] Seznam Case Study (#16825)

[noreply] [Website] Apache Hop Case Study (#16824)

[noreply] [BEAM-13694] Force hadoop-hdfs-client in hadoopVersion tests for hdfs

[noreply] [Website] Ricardo - added case study feedback (#16807)

[noreply] Merge pull request #16735 from [BEAM-13827] - fix medium file size

[noreply] Merge pull request #16753 from [BEAM-13837] [Playground] show graph on


------------------------------------------
[...truncated 56.31 KB...]
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started

> Task :runners:google-cloud-dataflow-java:****:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2660308 sha256=2430082c9c37bb9104dc6b0a94482c084fccf542b8128baf6cd638c037af96c9
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.21.0 botocore-1.24.0 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.1 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645026734.504134/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220216155214505034-7081'
 createTime: '2022-02-16T15:52:20.443267Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-16_07_52_20-17494883338203588634'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0216151505'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-16T15:52:20.443267Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-16_07_52_20-17494883338203588634]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-16_07_52_20-17494883338203588634
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-16_07_52_20-17494883338203588634?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-16_07_52_20-17494883338203588634 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:24.587Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.286Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.320Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.385Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.424Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.444Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.474Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.499Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.550Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.579Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.611Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.633Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.688Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.716Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.850Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.880Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.912Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.937Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:25.986Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:26.013Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:26.047Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:38.123Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:58.919Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:52:58.944Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:53:09.167Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:53:30.335Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T15:53:30.368Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-16_07_52_20-17494883338203588634 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 38c215382dc74d5b94ba0b72fd7e65ba and timestamp: 1645027513.2852328:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 147
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0216151505.1645027518.679022/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220216160518679923-4302'
 createTime: '2022-02-16T16:05:25.038516Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-16_08_05_24-17558860662151087410'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0216151505'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-16T16:05:25.038516Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-16_08_05_24-17558860662151087410]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-16_08_05_24-17558860662151087410
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-16_08_05_24-17558860662151087410?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-16_08_05_24-17558860662151087410 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:31.331Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.251Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.277Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.334Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.404Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.439Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.518Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.588Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.631Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.736Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.761Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.792Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.824Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.856Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.884Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.918Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:32.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.056Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.112Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.145Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.229Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.267Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.301Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.334Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.382Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.415Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:05:33.462Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:06:02.266Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:06:07.526Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:06:07.556Z: JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:06:17.737Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:06:40.600Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-16T16:06:40.638Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-16_08_05_24-17558860662151087410 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_6951886f-54ab-4841-aab7-b681d8dcf407_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-16_07_52_20-17494883338203588634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-16_08_05_24-17558860662151087410?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError:
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_6951886f-54ab-4841-aab7-b681d8dcf407_read'
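Two failures are chained above: the run first failed its on-success matcher (the 900-second Pub/Sub matcher timeout expired with 0 of the 1 expected messages received), and the cleanup then hit the same positional delete_subscription TypeError seen earlier in this digest. The matcher half is the PyHamcrest assert_that pattern visible in the traceback; a minimal sketch of that style of check, with hypothetical values standing in for what Beam's PubSubMessageMatcher tallies from the subscription:

    from hamcrest import assert_that as hc_assert_that, equal_to

    # Hypothetical stand-ins: the real matcher polls a Pub/Sub subscription
    # and counts received payloads before comparing against expectations.
    expected_counts = {b'2097152': 1}  # one marker message expected
    received_counts = {}               # nothing arrived before the timeout
    # Raises AssertionError with an expected-vs-actual description on mismatch.
    hc_assert_that(received_counts, equal_to(expected_counts))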

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 44m 4s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hgwng4bzkbov6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #616

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/616/display/redirect?page=changes>

Changes:

[akustov] fix name project id from secreton scio deploy action

[alexander.zhuravlev] [BEAM-13775] Fixed bug with run button

[ihr] [BEAM-13836] Fix the answers placeholders locations in the Python katas

[noreply] Merge pull request #16703 from [BEAM-13804][Playground][Bugfix] Add

[noreply] Merge pull request #16611 from [BEAM-13712][Playground] Add graph for

[noreply] Merge pull request #16757 from [BEAM-13655] [Playground] Persist the


------------------------------------------
[...truncated 19.69 KB...]
  Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.2.1-py2.py3-none-any.whl (75 kB)
Requirement already satisfied: packaging>=14.3 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<3,>=1.6.0->apache-beam==2.38.0.dev0) (21.3)
Collecting libcst>=0.2.5
  Using cached libcst-0.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.7 MB)
Collecting google-api-core[grpc]<3.0.0dev,>=1.29.0
  Using cached google_api_core-1.31.5-py2.py3-none-any.whl (93 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.3-py3-none-any.whl
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-auth<3,>=1.18.0->apache-beam==2.38.0.dev0) (60.9.0)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.1.0-py3-none-any.whl (14 kB)
Collecting grpcio-status>=1.18.0
  Using cached grpcio_status-1.43.0-py3-none-any.whl (10.0 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing<3,>=2.4.2
  Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting pbr>=0.11
  Using cached pbr-5.8.1-py2.py3-none-any.whl (113 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (2.1.3)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (1.11.0)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.12.0-py3-none-any.whl (54 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2659546 sha256=8a6495dabfde7f3222ea46d005c36b37dcd0a1723c92b56d019c9f8f0abee722
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.20.54 botocore-1.23.54 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644853900.274440/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644853900.274440/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644853900.274440/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644853900.274440/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644853900.274440/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644853900.274440/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220214155140275326-5888'
 createTime: '2022-02-14T15:51:46.421700Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-14_07_51_46-15288635667085931887'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0214125654'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-14T15:51:46.421700Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-14_07_51_46-15288635667085931887]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-14_07_51_46-15288635667085931887
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-14_07_51_46-15288635667085931887?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-14_07_51_46-15288635667085931887 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:52.211Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.268Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.300Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.372Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.405Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.440Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.469Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.513Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.557Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.584Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.615Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.669Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.696Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.733Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.788Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.936Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.964Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:53.998Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:54.039Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:54.093Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:54.114Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:51:54.143Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:52:10.825Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:52:37.443Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:53:01.639Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T15:53:01.677Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-14_07_51_46-15288635667085931887 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8790a60644254e54b6a791c9da5e7c54 and timestamp: 1644854683.7612848:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 95
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0214125654.1644854687.454406/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220214160447455301-5229'
 createTime: '2022-02-14T16:04:54.738100Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-14_08_04_54-15469906068508099727'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0214125654'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-14T16:04:54.738100Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-14_08_04_54-15469906068508099727]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-14_08_04_54-15469906068508099727
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-14_08_04_54-15469906068508099727?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-14_08_04_54-15469906068508099727 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:01.153Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.231Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.266Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.338Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.408Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.437Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.528Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.595Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.651Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.676Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.723Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.744Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.773Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.807Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.906Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:02.971Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.052Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.073Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
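(The fusion messages above spell out the read-side pipeline's shape. A minimal sketch of a pipeline with that shape; subscription/topic names are placeholders, and the real transforms live in pubsub_io_perf_test.py. Note that CombineGlobally expands into the KeyWithVoid/CombinePerKey/UnKey steps named in the log:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.testing.load_tests.load_test_metrics_utils import MeasureTime
    from apache_beam.transforms import window

    SUB = 'projects/apache-beam-testing/subscriptions/my-read-sub'  # placeholder
    TOPIC = 'projects/apache-beam-testing/topics/my-results-topic'  # placeholder

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | 'Read from pubsub' >> beam.io.ReadFromPubSub(subscription=SUB)
         | 'Map' >> beam.Map(lambda msg: len(msg))          # the lambda at line 171
         | 'Measure time' >> beam.ParDo(MeasureTime('pubsub_io_perf_read'))
         | 'Window' >> beam.WindowInto(window.FixedWindows(60))
         | 'Count messages' >> beam.CombineGlobally(
               beam.combiners.CountCombineFn()).without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('ascii'))
         | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic=TOPIC))
)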
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.151Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.172Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.212Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.241Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.306Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.347Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:03.372Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:37.494Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
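(The message above is Cloud Monitoring's 100-descriptor limit on Dataflow-created custom metrics. A minimal cleanup sketch using the google-cloud-monitoring v3 client, assuming the custom.googleapis.com/* descriptors are old/unused as the log suggests; deleting a descriptor also deletes its time series:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Destructive: only appropriate for stale perf-test metrics.
        client.delete_metric_descriptor(name=descriptor.name)
)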
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:05:47.255Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:06:11.315Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-14T16:06:11.352Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-14_08_04_54-15469906068508099727 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fa137f460d294bc3a5584cb7e8321fe4 and timestamp: 1644855491.1758497:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 121
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: fa137f460d294bc3a5584cb7e8321fe4 and timestamp: 1644855491.1758497:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 121
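(The timeout plus the metric lines above are the load-test harness pattern: wait a bounded time for the streaming job, then query runtime distributions from the result. A minimal sketch of that pattern; the metric namespace and name are illustrative:

    import apache_beam as beam
    from apache_beam.metrics.metric import MetricsFilter
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline(options=PipelineOptions(streaming=True))
    # ... transforms elided; see the read-pipeline sketch above ...
    result = pipeline.run()

    # duration is in milliseconds; a streaming job may still be RUNNING after
    # roughly ten minutes, which is where "Timing out on waiting..." is logged.
    result.wait_until_finish(duration=600 * 1000)
    if not PipelineState.is_terminal(result.state):
        pass  # collect whatever metrics are available anyway

    query = result.metrics().query(
        MetricsFilter().with_namespace('pubsub_io_perf').with_name('runtime'))
    for dist in query['distributions']:
        print(dist.committed)
)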
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-14_07_51_46-15288635667085931887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-14_08_04_54-15469906068508099727?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_28a7cd00-918d-4bdb-ba57-35a63b8528f1_read'
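(The TypeError is the google-cloud-pubsub 2.x calling convention: a positional string is interpreted as the request mapping for DeleteSubscriptionRequest. A minimal sketch of the keyword form that 2.x expects; the subscription name is illustrative:

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    sub_path = 'projects/apache-beam-testing/subscriptions/my-read-sub'  # illustrative
    # google-cloud-pubsub >= 2.0 rejects positional arguments; pass a keyword
    # (or a google.pubsub_v1.DeleteSubscriptionRequest) instead:
    subscriber.delete_subscription(subscription=sub_path)
)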

FATAL: Java heap space
java.lang.OutOfMemoryError: Java heap space
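(The OutOfMemoryError comes from the JVM side of the build, not the Python test. One common mitigation is raising the Gradle daemon heap in gradle.properties; the sizes below are illustrative, not the project's actual settings:

    org.gradle.jvmargs=-Xmx4g -XX:MaxMetaspaceSize=512m
)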

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #615

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/615/display/redirect>

Changes:


------------------------------------------
[...truncated 55.70 KB...]
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2659546 sha256=816aa5926a4fe283ac8930de2f1a1e4f402846fdce8c20f7bd0488498f5b5272
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.20.54 botocore-1.23.54 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.1.0 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644767466.962413/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220213155106963308-3258'
 createTime: '2022-02-13T15:51:13.057413Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-13_07_51_12-3731372745037595696'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0213150448'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-13T15:51:13.057413Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-13_07_51_12-3731372745037595696]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-13_07_51_12-3731372745037595696
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-13_07_51_12-3731372745037595696?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-13_07_51_12-3731372745037595696 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:19.059Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:19.981Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.093Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.304Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.394Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.418Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.446Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.510Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.538Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.563Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.596Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.629Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.686Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.715Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
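(This fusion block is the write-side pipeline: a bounded synthetic source formatted into fixed-size payloads and published. A minimal sketch of that shape; the element count and payload size are placeholders chosen to total 2 GB like the job name, and the real test uses a SyntheticSource rather than Create:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.testing.load_tests.load_test_metrics_utils import MeasureTime

    TOPIC = 'projects/apache-beam-testing/topics/my-write-topic'  # illustrative

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | 'Create input' >> beam.Create(range(2 * 1024))  # stand-in for SyntheticSource
         | 'Format to pubsub message in bytes' >> beam.Map(lambda _: b'\0' * 1024 * 1024)
         | 'Measure time' >> beam.ParDo(MeasureTime('pubsub_io_perf_write'))
         | 'Write to Pubsub' >> beam.io.WriteToPubSub(topic=TOPIC))
)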
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.877Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.908Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.931Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:20.963Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:21.011Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:21.050Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:21.087Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:51:57.869Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:52:00.636Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:52:25.889Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T15:52:25.949Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-13_07_51_12-3731372745037595696 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: ff1d2243020f4a66a2261f90a37ad4bc and timestamp: 1644768311.0457594:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 105
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0213150448.1644768316.905691/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220213160516906674-9127'
 createTime: '2022-02-13T16:05:22.963169Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-13_08_05_22-8354891653251235693'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0213150448'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-13T16:05:22.963169Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-13_08_05_22-8354891653251235693]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-13_08_05_22-8354891653251235693
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-13_08_05_22-8354891653251235693?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-13_08_05_22-8354891653251235693 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:27.332Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.086Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.110Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.167Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.220Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.251Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.296Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.349Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.377Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.392Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.422Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.456Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.478Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.508Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.542Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.576Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.641Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.662Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.684Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.704Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.737Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.804Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.835Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.868Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.900Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.956Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:28.996Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:29.032Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:05:56.282Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:06:07.271Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:06:07.300Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:06:17.622Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:06:37.945Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-13T16:06:37.968Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-13_08_05_22-8354891653251235693 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 13efec7e00e34d9182077babdc372bed and timestamp: 1644769130.5197916:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 81
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 13efec7e00e34d9182077babdc372bed and timestamp: 1644769130.5197916:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 81
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-13_07_51_12-3731372745037595696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-13_08_05_22-8354891653251235693?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_00d724e1-9ba2-4fda-b691-ebaf137c684e_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 53s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ga2kgaw32wgbc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #614

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/614/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13834] Increase influxDB persistent storage. (#16817)

[noreply] Minor: Fix link to nexmark benchmarks (#16803)

[noreply] Regenerate python container base_image_requirements.txt (#16832)

[randomstep] [BEAM-9195] Bump org.testcontainers to 1.16.3


------------------------------------------
[...truncated 55.08 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2659546 sha256=a4fe8c045c6270b4b80ef12da6a6364e3af163918ed52ecad5ddaf70bb0c046e
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0

> Task :runners:google-cloud-dataflow-java:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.20.54 botocore-1.23.54 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681175.150875/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220212155255151729-2246'
 createTime: '2022-02-12T15:53:00.999968Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-12_07_53_00-12184437270919248059'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0212150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-12T15:53:00.999968Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-12_07_53_00-12184437270919248059]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-12_07_53_00-12184437270919248059
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-12_07_53_00-12184437270919248059?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-12_07_53_00-12184437270919248059 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:05.575Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.523Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.574Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.634Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.664Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.694Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.717Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.742Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.829Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.852Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.876Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.904Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:07.961Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.085Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.117Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.150Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.226Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.283Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.300Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:08.329Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:34.249Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:53:51.741Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:54:15.766Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T15:54:15.792Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-12_07_53_00-12184437270919248059 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 26cf1661e27d49239de9c706a5e22eae and timestamp: 1644681974.7623534:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 95
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0212150512.1644681979.088836/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220212160619089732-1117'
 createTime: '2022-02-12T16:06:25.713324Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-12_08_06_25-17020525367473321733'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0212150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-12T16:06:25.713324Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-12_08_06_25-17020525367473321733]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-12_08_06_25-17020525367473321733
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-12_08_06_25-17020525367473321733?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-12_08_06_25-17020525367473321733 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:33.415Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.172Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.207Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.275Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.336Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.365Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.422Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.496Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.528Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.562Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.589Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.621Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.648Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.673Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.702Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.735Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:34.988Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.072Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.103Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.133Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.157Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.205Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.238Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:35.292Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:06:54.369Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
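
The metric-descriptor notice above refers to the per-project quota on custom.googleapis.com metric descriptors. If the custom metrics were needed, stale descriptors could be pruned with the Cloud Monitoring client; a sketch, assuming google-cloud-monitoring is installed (deletion is irreversible, so the filter matters):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"

    for descriptor in client.list_metric_descriptors(name=project_name):
        # Only custom descriptors count against the quota mentioned above.
        if descriptor.type.startswith("custom.googleapis.com/"):
            client.delete_metric_descriptor(name=descriptor.name)
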
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:07:16.228Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:07:39.168Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-12T16:07:39.218Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-12_08_06_25-17020525367473321733 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e445df6e2a8d4f04bc7ea61e7287f135 and timestamp: 1644682766.9081693:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 85
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_ba67e76f-afcb-4cc0-88af-654cffc807ac_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-12_07_53_00-12184437270919248059?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-12_08_06_25-17020525367473321733?project=apache-beam-testing
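
The TypeError above is the actual crash during cleanup: google-cloud-pubsub 2.x (2.9.0 is installed in these builds) no longer accepts a bare subscription path as a positional argument; requests are proto-plus messages built from keyword arguments or a request mapping. A sketch of the call shape that the 2.x client expects (the subscription id is a placeholder):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = ("projects/apache-beam-testing/subscriptions/"
                "pubsub_io_performance_<uuid>_read")  # placeholder id

    # Keyword-argument form accepted by the 2.x client:
    sub_client.delete_subscription(subscription=sub_name)
    # Equivalent request-mapping form:
    # sub_client.delete_subscription(request={"subscription": sub_name})
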

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 5s
92 actionable tasks: 64 executed, 26 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wl7ysperz53mo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #613

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/613/display/redirect?page=changes>

Changes:

[Carl Yeksigian] Cache bucket matcher regex in GcsPath

[benjamin.gonzalez] [BEAM-12672] Retry flaky tests

[benjamin.gonzalez] [BEAM-12672] Fix spotlessApply

[laraschmidt] Fixing the log line to properly handle a large allowed skew.

[n] BEAM-13159 Update embedded-redis dependency

[n] address comments

[noreply] Minor: Add 2.38.0 section to CHANGES.md (#16804)

[noreply] [BEAM-12000] Fix typo in portable Python job definition (#16812)

[noreply] [BEAM-12164]: Fixes SpannerChangeStreamIT (#16806)

[noreply] [BEAM-12572] Fix failures in python examples tests (#16781)

[noreply] [BEAM-13921] filter out debeziumIO test for spark runner (#16815)

[noreply] [BEAM-13855] Skip SpannerChangeStreamOrderedWithinKeyIT and

[noreply] [BEAM-13679] playground - move quick start category to the top (#16808)

[noreply] Update license_script.sh (#16789)

[noreply] [BEAM-13908] [Coverage] Better testing coverage for gcpopts (#16816)

[noreply] Merge pull request #16809 from [BEAM-12164] Added integration test for

[noreply] [BEAM-4032]Support staging binary distributions of dependency packages


------------------------------------------
[...truncated 56.78 KB...]
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2659546 sha256=a0a3bf28b109a251ef99e438557a2cf6f14290bb7c3a502a4e15b4c4002e5e8c
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.20.53 botocore-1.23.53 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/dataflow-worker.jar in 8 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644603301.451277/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220211181501452195-4040'
 createTime: '2022-02-11T18:15:10.854393Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-11_10_15_09-1539623431457021013'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0211065727'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-11T18:15:10.854393Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-11_10_15_09-1539623431457021013]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-11_10_15_09-1539623431457021013
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-11_10_15_09-1539623431457021013?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-11_10_15_09-1539623431457021013 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:16.994Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:17.972Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.022Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.090Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.126Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.177Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.211Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.256Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.296Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.334Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.367Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.402Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.500Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.532Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.692Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.731Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.769Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.805Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.861Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.901Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:18.938Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:15:37.813Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:16:19.830Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:16:44.284Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:16:44.316Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-11_10_15_09-1539623431457021013 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: bb49dd9eb04646f8932cc2b9d471d7a9 and timestamp: 1644604083.439891:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 90
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0211065727.1644604087.795395/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220211182807796988-9424'
 createTime: '2022-02-11T18:28:15.623747Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-11_10_28_14-52995843861356284'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0211065727'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-11T18:28:15.623747Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-11_10_28_14-52995843861356284]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-11_10_28_14-52995843861356284
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-11_10_28_14-52995843861356284?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-11_10_28_14-52995843861356284 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:23.040Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.642Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.676Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.745Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.818Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.848Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.919Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:25.964Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.005Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.033Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.068Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.092Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.127Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.160Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.188Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.246Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.280Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.314Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.339Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.365Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.402Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.442Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.478Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.512Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.548Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.624Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.662Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:26.705Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:28:32.702Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:29:13.310Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:29:37.119Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-11T18:29:37.167Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-11_10_28_14-52995843861356284 after 603 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_1b91f667-211d-492c-9ce0-e5542d4e707a_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-11_10_15_09-1539623431457021013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-11_10_28_14-52995843861356284?project=apache-beam-testing
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 149, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_1b91f667-211d-492c-9ce0-e5542d4e707a_read'
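
Note that the cleanup TypeError masks the original failure: the matcher received 0 of the 1 expected message within 900 seconds. One way to check whether anything reached the matcher subscription at all is a synchronous pull; a sketch with the same 2.x client (the subscription path is a placeholder):

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    sub_path = ("projects/apache-beam-testing/subscriptions/"
                "pubsub_io_performance_<uuid>_read_matcher")  # placeholder

    # Synchronous pull of whatever messages are immediately available.
    response = subscriber.pull(
        request={"subscription": sub_path, "max_messages": 10})
    print(f"pulled {len(response.received_messages)} message(s)")
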

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 52s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uixx3efiajsdg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #612

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/612/display/redirect?page=changes>

Changes:

[david.prieto.rivera] Missing contribution

[noreply] [BEAM-13803] Add support for native iterable side inputs to the Go SDK

[noreply] [BEAM-11095] Better error handling for illegal emit functions (#16776)

[noreply] Merge pull request #16613 from Supporting JdbcIO driver in classpath for

[noreply] Merge pull request #15848 from [BEAM-13835] An any-type implementation

[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.

[Valentyn Tymofieiev] Add a container for Python 3.9.

[Valentyn Tymofieiev] Allow job submission with Python 3.9 on Dataflow runner

[Valentyn Tymofieiev] Add Python 3.9 test suites. Keep Dataflow V1 suites unchanged for now.

[Valentyn Tymofieiev] Add py3.9 Github actions suites.

[Valentyn Tymofieiev] Py39 Doc updates.

[Valentyn Tymofieiev] [BEAM-9980] Simplify run_validates_container.sh to avoid branching.

[Valentyn Tymofieiev] Update Cython to a new version that has py39 wheels.

[Valentyn Tymofieiev] [BEAM-13845] Fix comparison with potentially incomparable default

[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.

[Valentyn Tymofieiev] Mark Python 3.9 as supported version.

[noreply] [release-2.36.0][website] Fix github release notes script, header for

[noreply] Use shell to run python for setupVirtualenv (#16796)

[Daniel Oliveira] [BEAM-13830] Properly shut down Debezium expansion service in IT script.

[noreply] Merge pull request #16659 from [BEAM-13774][Playground] Add user to

[Valentyn Tymofieiev] [BEAM-13868] Remove gsutil dep from hdfs IT test.

[noreply] [BEAM-13776][Playground] (#16731)

[noreply] [BEAM-13867] Drop NaNs returned by nlargest in flight_delays example

[noreply] Announce Python 3.9 in CHANGES.md (#16802)

[Brian Hulette] Moving to 2.38.0-SNAPSHOT on master branch.

[noreply] [BEAM-11095] Better error handling for iter/reiter/multimap (#16794)


------------------------------------------
[...truncated 56.53 KB...]
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2656759 sha256=e2188b7e166e11d5873475c9ed8cc54140795d85abf42924323c91c708eff845
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.1 azure-storage-blob-12.9.0 boto3-1.20.52 botocore-1.23.52 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.1 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644508235.230461/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220210155035231390-7108'
 createTime: '2022-02-10T15:50:41.782053Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-10_07_50_40-13860593473381725575'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0210150455'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-10T15:50:41.782053Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-10_07_50_40-13860593473381725575]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-10_07_50_40-13860593473381725575
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_07_50_40-13860593473381725575?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-10_07_50_40-13860593473381725575 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:49.233Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:49.805Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:49.861Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:49.930Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:49.968Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.006Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.044Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.077Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.118Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.154Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.187Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.221Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.251Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.320Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.353Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.501Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.532Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.563Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.596Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.654Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.697Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:50:50.735Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:51:23.706Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
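The quota message above suggests deleting stale custom metric descriptors. A minimal sketch of that cleanup with the Cloud Monitoring client, assuming the google-cloud-monitoring package and the project id shown in the log (the filter string is an assumption; review the output before deleting, since deletion is irreversible):

    from google.cloud import monitoring_v3

    # Sketch only: list the custom metric descriptors in the test project
    # and print them; uncomment the delete call after reviewing the output.
    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/apache-beam-testing",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)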
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:51:37.758Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:52:00.293Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T15:52:00.324Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-10_07_50_40-13860593473381725575 after 603 seconds
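The timeout warning above is the expected end of a bounded wait on a streaming job, not a crash: in the Beam Python SDK the wait is bounded with wait_until_finish, roughly as in this sketch (the pipeline variable and the 10-minute figure are assumptions, chosen to mirror the ~600 s in the log):

    # Sketch: wait_until_finish takes a duration in milliseconds and returns
    # the job state; a still-running streaming job yields the timeout warning.
    result = pipeline.run()  # `pipeline` is assumed to be an apache_beam.Pipeline
    state = result.wait_until_finish(duration=10 * 60 * 1000)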
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 59f97b3e4756495784ba1d97ce082499 and timestamp: 1644509014.4097326:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0210150455.1644509019.057780/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
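The two warnings above only mean the flag is not registered with PipelineOptions; registering it in a custom options class makes it parseable, roughly as in this sketch (the class name and default are illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfTestOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # A registered flag no longer shows up in "Discarding unparseable args".
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(options.view_as(PubsubPerfTestOptions).pubsub_namespace_prefix)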
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220210160339058671-5885'
 createTime: '2022-02-10T16:03:46.200448Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-10_08_03_45-8494233491884788239'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0210150455'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-10T16:03:46.200448Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-10_08_03_45-8494233491884788239]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-10_08_03_45-8494233491884788239
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_08_03_45-8494233491884788239?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-10_08_03_45-8494233491884788239 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:53.058Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:54.848Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:54.881Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:54.945Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.027Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.046Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.101Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.148Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.179Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.207Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.280Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.312Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.344Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.378Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.403Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.434Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.467Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.488Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.521Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.557Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.579Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.612Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.635Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.657Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.678Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.722Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.756Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:03:55.786Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:04:04.970Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:04:41.115Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:05:06.760Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-10T16:05:06.791Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-10_08_03_45-8494233491884788239 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_8f99a457-bd20-473b-8237-22b6ee8777e2_read_matcher.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_07_50_40-13860593473381725575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_08_03_45-8494233491884788239?project=apache-beam-testing
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_8f99a457-bd20-473b-8237-22b6ee8777e2_read'
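The TypeError above points at the delete_subscription call in cleanup: in google-cloud-pubsub 2.x the first positional parameter is a DeleteSubscriptionRequest, so passing the subscription path as a bare positional string fails. A sketch of the likely fix, with the subscription path shortened here as a placeholder:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = "projects/apache-beam-testing/subscriptions/example_read"  # placeholder

    # Fails as in the log: the string is interpreted as a request object.
    # sub_client.delete_subscription(read_sub_name)

    # Works: pass the path through the flattened keyword parameter instead.
    sub_client.delete_subscription(subscription=read_sub_name)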

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 55s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/a74tyo2tf3t6c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #611

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/611/display/redirect?page=changes>

Changes:

[noreply] Update README.md

[marco.robles] Update README with latest PreCommit Jobs

[marco.robles] Update Postcommit jobs with latest jobs

[marco.robles] Update Performance job tests in readme

[marco.robles] update load job tests with latest updates

[marco.robles] update other jobs test with latest updates

[marco.robles] mismatch links fix

[marco.robles] update trigger phrase for some postCommit jobs

[marco.robles] correct trigger phrases in readme

[marco.robles] add pending jobs to readme

[noreply] Update README.md

[mmack] [BEAM-13246] Add support for S3 Bucket Key at the object level (AWS Sdk

[Pablo Estrada] Output successful rows from BQ Streaming Inserts

[schapman] BEAM-13439 Type annotation for ptransform_fn

[mmack] [adhoc] Remove remaining usage of Powermock from aws2.

[marco.robles] fix broken links in jobs & remove the invalid ones

[Kyle Weaver] Update Dataflow Python dev container images.

[Kiley Sok] Add java 17 to changes

[noreply] [BEAM-12914] Add missing 3.9 opcodes to type inference. (#16761)

[noreply] [BEAM-13321] Initial BigQueryIO externalization. (#16489)

[noreply] [BEAM-13193] Enable process bundle response elements embedding in Java

[noreply] [BEAM-13830] added a debeziumio_expansion_addr flag to GoSDK (#16780)

[noreply] Apply spotless. (#16783)

[Daniel Oliveira] [BEAM-13732] Switch x-lang BigQueryIO expansion service to GCP one.

[noreply] [BEAM-13858] Fix broken github action on :sdks:go:examples:wordCount

[Kiley Sok] add jira for runner v2

[noreply] [BEAM-13732] Go SDK BigQuery IO wrapper. Initial implementation.

[noreply] [BEAM-13732] Add example for Go BigQuery IO wrapper. (#16786)

[noreply] Update CHANGES.md with Go SDK milestones. (#16787)

[noreply] [BEAM-13193] Allow BeamFnDataOutboundObserver to flush elements.


------------------------------------------
[...truncated 56.46 KB...]
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2655298 sha256=b6ad76eada396525d5fe07b561f3688af7e9d2fbe140bd2d30dab830624b46a1
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.0 azure-storage-blob-12.9.0 boto3-1.20.51 botocore-1.23.51 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644421831.534530/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220209155031535526-2874'
 createTime: '2022-02-09T15:50:38.016135Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-09_07_50_37-13124903489822438399'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0209151035'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-09T15:50:38.016135Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-09_07_50_37-13124903489822438399]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-09_07_50_37-13124903489822438399
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-09_07_50_37-13124903489822438399?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-09_07_50_37-13124903489822438399 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:45.873Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:48.851Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:48.883Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:48.949Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:48.978Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.018Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.058Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.090Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.132Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.166Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.198Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.230Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.283Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.316Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.389Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.500Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.521Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.552Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.576Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.641Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.664Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:50:49.715Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:51:22.637Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:51:34.133Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:52:03.399Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T15:52:03.434Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-09_07_50_37-13124903489822438399 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: a17c8391be204773946393ae1032d103 and timestamp: 1644422615.611977:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0209151035.1644422620.669379/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220209160340670260-6302'
 createTime: '2022-02-09T16:03:48.426492Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-09_08_03_47-3116331457409957914'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0209151035'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-09T16:03:48.426492Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-09_08_03_47-3116331457409957914]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-09_08_03_47-3116331457409957914
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-09_08_03_47-3116331457409957914?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-09_08_03_47-3116331457409957914 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:03:56.220Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:00.708Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:00.740Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:00.802Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:00.900Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:00.928Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:00.996Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.062Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.091Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.117Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.150Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.203Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.225Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.307Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.344Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.381Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.426Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.471Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.503Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.534Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.581Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.644Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.688Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.718Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.772Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.817Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:01.854Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:14.827Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:04:46.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:05:12.641Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-09T16:05:12.666Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-09_08_03_47-3116331457409957914 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_9a8d19a4-fa4b-43cf-af11-c4ebb93f2ffd_read_matcher.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-09_07_50_37-13124903489822438399?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-09_08_03_47-3116331457409957914?project=apache-beam-testing
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_9a8d19a4-fa4b-43cf-af11-c4ebb93f2ffd_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 48s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tnfcjkdyfh5qs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #610

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/610/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12976] Log projection pushdown optimizations.

[benjamin.gonzalez] [BEAM-12572] Change jobs to run as cron jobs

[Ismaël Mejía] [BEAM-13839] Upgrade zstd-jni to version 1.5.2-1

[mmack] [BEAM-13840] Fix usage of legacy rawtypes in AWS modules

[alexander.zhuravlev] [BEAM-13820] Changed color of delete icon in pipeline options dropdown,

[noreply] [BEAM-11971] Revert "Fix timer consistency in direct runner" (#16748)

[noreply] [BEAM-13193] Aggregates fn api outbound data/timers of different

[noreply] [BEAM-13767] Migrate a bundle of grade tasks to use configuration

[noreply] Merge pull request #16653 from [BEAM-12164]: Add integration tests for

[noreply] Merge pull request #16728 from [BEAM-13823] Update docs for SnowflakeIO

[noreply] Merge pull request #16660 from [BEAM-13771][Playground] Send multifile

[noreply] Merge pull request #16646 from [BEAM-13643][Playground] Setup running

[noreply] [BEAM-13015] Add state caching benchmark and move benchmarks to their

[noreply] [BEAM-13419] Check for initialization in dataflow runner (#16765)

[noreply] Merge pull request #16701 from [BEAM-13786] [Playground] [Bugfix] Update

[noreply] Merge pull request #16754 from [BEAM-13838][Playground] Add logs in case

[noreply] [BEAM-13293] consistent naming for expansion service address and flag

[noreply] Merge pull request #16700 from [BEAM-13790][Playground] Change logic of

[noreply] [BEAM-13830] update dependency for debeziumio expansion service (#16743)

[noreply] [BEAM-13761] consistent namings for expansion address in Debezium IO

[noreply] [BEAM-13806] Shutting down SchemaIO expansion services from Go VR

[noreply] [release-2.36.0] Update website/changelog for release 2.36.0 (#16627)

[noreply] [BEAM-13848] Update numpy intersphinx link (#16767)

[noreply] [release-2.36.0] Fix JIRA link for 2.36 blog (#16771)

[noreply] [BEAM-13647] Use role for Go worker binary. (#16729)

[noreply] [BEAM-13606] Fail bundles with failed BigTable mutations (#16751)


------------------------------------------
[...truncated 56.50 KB...]
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2654682 sha256=bcfef422a6a9f2ed2407f79f0b5a34db500de5a5ab59f6fbab2ae1504f6c94f7
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.0 azure-storage-blob-12.9.0 boto3-1.20.50 botocore-1.23.50 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.2.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.0 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644335438.286915/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
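
For context, this warning comes from Beam's PipelineOptions layer: any command-line flag that no registered options class declares is reported as unparseable and dropped. A minimal sketch of registering such a flag so it parses cleanly (the options class name here is illustrative, not the one the perf test actually uses):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Declaring the flag keeps it out of "Discarding unparseable args".
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix)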
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220208155038287864-7490'
 createTime: '2022-02-08T15:50:44.879824Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-08_07_50_44-6452877551075740486'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0208150931'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-08T15:50:44.879824Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-08_07_50_44-6452877551075740486]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-08_07_50_44-6452877551075740486
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-08_07_50_44-6452877551075740486?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-08_07_50_44-6452877551075740486 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:52.350Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.555Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.588Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.656Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.702Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.735Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.768Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.801Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.871Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.902Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.929Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.963Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:53.995Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.029Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.063Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.092Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.202Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.233Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.270Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.304Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.351Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.376Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:50:54.424Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:51:28.236Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:51:39.065Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:52:02.907Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T15:52:02.958Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-08_07_50_44-6452877551075740486 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 91fb2d2b18004996a1820816f7e1a202 and timestamp: 1644336212.4496534:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0208150931.1644336217.140675/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220208160337141593-6282'
 createTime: '2022-02-08T16:03:44.707323Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-08_08_03_43-3057433617570904052'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0208150931'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-08T16:03:44.707323Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-08_08_03_43-3057433617570904052]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-08_08_03_43-3057433617570904052
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-08_08_03_43-3057433617570904052?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-08_08_03_43-3057433617570904052 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:53.104Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.057Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.090Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.150Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.218Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.248Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.330Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.440Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.523Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.614Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.656Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.708Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.731Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.776Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.800Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.854Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.933Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.955Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:03:59.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.020Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.047Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.074Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.095Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.117Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.161Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.186Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:00.477Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:17.185Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:04:44.993Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:05:09.959Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-08T16:05:09.990Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-08_08_03_43-3057433617570904052 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_5e597e14-43cd-4f44-bbd2-2c9cbbb9a317_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-08_07_50_44-6452877551075740486?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-08_08_03_43-3057433617570904052?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
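
For context, the expected payload b'2097152' is consistent with the test's 2 GB input divided into 1 KB messages, i.e. the message count the read pipeline should republish. A quick check of that arithmetic (the 1 KB message size is an assumption inferred from the job name, not stated in this log):

    # assumed: 2 GB of input split into 1 KB messages
    >>> (2 * 1024**3) // 1024
    2097152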


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 519, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_5e597e14-43cd-4f44-bbd2-2c9cbbb9a317_read'
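
This cleanup TypeError reflects the calling-convention change in google-cloud-pubsub 2.x: GAPIC client methods accept a request object/dict or keyword arguments, not a bare positional resource path. A minimal sketch of the difference, using a placeholder subscription path:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/my-project/subscriptions/my-sub'  # placeholder

    # pubsub 1.x style; on 2.x this raises the TypeError seen above:
    #   sub_client.delete_subscription(sub_path)

    # pubsub 2.x accepts keyword arguments or a request dict:
    sub_client.delete_subscription(subscription=sub_path)
    # equivalently: sub_client.delete_subscription(request={'subscription': sub_path})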

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 38s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h67l3wqvsmw3s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #609

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/609/display/redirect>

Changes:


------------------------------------------
[...truncated 57.22 KB...]
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2654153 sha256=511b86e1e45d187fc33e808ce31f5f0d21145b9c63c55c3ee267e7bb9f495616
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.0 azure-storage-blob-12.9.0 boto3-1.20.49 botocore-1.23.49 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644251672.204184/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220207163432205849-6106'
 createTime: '2022-02-07T16:34:42.455057Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-07_08_34_39-6314020205176262207'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0207154543'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-07T16:34:42.455057Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-07_08_34_39-6314020205176262207]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-07_08_34_39-6314020205176262207
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_08_34_39-6314020205176262207?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-07_08_34_39-6314020205176262207 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:51.537Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.289Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.317Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.386Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.433Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.480Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.524Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.599Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.657Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.694Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.721Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.743Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.834Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.865Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:52.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.044Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.072Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.107Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.128Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.183Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.204Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:34:53.361Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:35:10.984Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:35:34.427Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:36:01.501Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:36:01.537Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-07_08_34_39-6314020205176262207 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: be2bd5071f8344048c09fe11a82a45d5 and timestamp: 1644252450.216415:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0207154543.1644252457.221067/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220207164737223903-4327'
 createTime: '2022-02-07T16:47:45.812546Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-07_08_47_45-6609656032174822108'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0207154543'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-07T16:47:45.812546Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-07_08_47_45-6609656032174822108]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-07_08_47_45-6609656032174822108
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_08_47_45-6609656032174822108?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-07_08_47_45-6609656032174822108 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:53.884Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:55.823Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:55.870Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:55.957Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.031Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.064Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.145Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.216Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.281Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.317Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.349Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.381Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.413Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.444Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.468Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.501Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.542Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.575Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.640Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.702Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.829Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.859Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.892Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.930Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:56.984Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:57.015Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:47:57.049Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:48:31.473Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:48:41.855Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:49:07.066Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T16:49:07.100Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-07_08_47_45-6609656032174822108 after 600 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_9dfe33c7-d34f-4bbf-b8f1-525c7aba0b45_read_matcher.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py",> line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py",> line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_08_34_39-6314020205176262207?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_08_47_45-6609656032174822108?project=apache-beam-testing


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_9dfe33c7-d34f-4bbf-b8f1-525c7aba0b45_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 38s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tzb3s5j4fel3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #608

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/608/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16726 from [BEAM-12164]: Parses change streams

[mmack] [BEAM-13147] Avoid nullness issue during init of AwsModule (AWS Sdk v2)


------------------------------------------
[...truncated 56.22 KB...]
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2654153 sha256=d757fcc39ece87ff62852f5ba83520bfac9932921702a995b16764a93b58a29d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.0 azure-storage-blob-12.9.0 boto3-1.20.49 botocore-1.23.49 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:classes
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644245341.085708/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220207144901086712-5405'
 createTime: '2022-02-07T14:49:10.856925Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-07_06_49_07-14416225496842764159'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0206065713'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-07T14:49:10.856925Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-07_06_49_07-14416225496842764159]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-07_06_49_07-14416225496842764159
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_06_49_07-14416225496842764159?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-07_06_49_07-14416225496842764159 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:16.570Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.535Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.561Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.652Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.689Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.721Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.763Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.796Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.838Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.875Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.913Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.946Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.968Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:17.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.029Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.064Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
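
For orientation, the fusion messages above trace the write-side job: a bounded synthetic source is expanded into an SDF read, formatted into byte payloads, timed, and published. A rough sketch of that shape, keeping the step names from the log; the topic path, element count, and payload below are placeholders, not values from the actual test:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    topic = "projects/my-project/topics/my-topic"  # placeholder

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | "Create input" >> beam.Create(range(1000))  # SyntheticSource in the real test
         | "Format to pubsub message in bytes" >> beam.Map(lambda i: b"x" * 100)
         | "Measure time" >> beam.Map(lambda payload: payload)  # metrics hook in the real test
         | "Write to Pubsub" >> beam.io.WriteToPubSub(topic=topic))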
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.158Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.191Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.222Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.245Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.303Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.346Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:18.395Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:49:53.063Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:50:00.148Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:50:00.181Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:50:20.723Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:50:23.350Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T14:50:23.396Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-07_06_49_07-14416225496842764159 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 77346f12632a462ba3b9b31d5c90550d and timestamp: 1644246115.1524196:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 76
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
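
Note that the measurement itself succeeded (pubsub_io_perf_write_runtime was recorded); only publication to InfluxDB failed, because the metrics server's disk is full. That is an infrastructure problem on the InfluxDB host rather than a test bug. A minimal sketch (not part of the test) for confirming the condition on that host, using the data directory named in the error message:

    import shutil

    # Disk usage for the InfluxDB data directory from the error message.
    total, used, free = shutil.disk_usage("/var/lib/influxdb")
    print("free bytes: %d (%.1f%% used)" % (free, 100 * used / total))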
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0206065713.1644246119.079416/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220207150159080321-9323'
 createTime: '2022-02-07T15:02:07.067289Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-07_07_02_06-2773256163943776152'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0206065713'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-07T15:02:07.067289Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-07_07_02_06-2773256163943776152]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-07_07_02_06-2773256163943776152
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_07_02_06-2773256163943776152?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-07_07_02_06-2773256163943776152 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:14.885Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:15.848Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:15.878Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:15.949Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.044Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.076Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.171Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.244Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.288Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.348Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.382Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.448Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.516Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.556Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.654Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.733Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
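
The read-side job mirrors the write side: messages are pulled from the subscription, timed, windowed, counted, and the count is written back to Pub/Sub. The Count messages/KeyWithVoid, CombinePerKey, and UnKey steps in the log are the standard expansion of a global count. A rough sketch of that shape with the step names from the log; the subscription/topic paths and the window size below are placeholders, not values from the actual test:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import window

    subscription = "projects/my-project/subscriptions/my-sub"  # placeholder
    topic = "projects/my-project/topics/my-results"            # placeholder

    with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
        (p
         | "Read from pubsub" >> beam.io.ReadFromPubSub(subscription=subscription)
         | "Measure time" >> beam.Map(lambda msg: msg)  # metrics hook in the real test
         | "Window" >> beam.WindowInto(window.FixedWindows(60))
         | "Count messages" >> beam.combiners.Count.Globally().without_defaults()
         | "Convert to bytes" >> beam.Map(lambda n: str(n).encode("utf-8"))
         | "Write to Pubsub" >> beam.io.WriteToPubSub(topic=topic))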
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.778Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.810Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.842Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.875Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.940Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:16.978Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:17.012Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:02:39.005Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:03:02.580Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:03:30.313Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-07T15:03:30.345Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-07_07_02_06-2773256163943776152 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 5142beeb27b142a9a2ee66552d764a94 and timestamp: 1644246910.988459:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 98
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 5142beeb27b142a9a2ee66552d764a94 and timestamp: 1644246910.988459:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 98
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_18e2f06d-030c-4d50-b7a7-9d0b092d51b0_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_06_49_07-14416225496842764159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-07_07_02_06-2773256163943776152?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 26s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qiagu7lmvc4d4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #607

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/607/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13828] Fix stale bot (#16734)

[noreply] Merge pull request #16364 from [BEAM-13182]  Add diagrams to backend

[noreply] [BEAM-13811] Fix save_main_session arg in tests examples (#16709)

[Kiley Sok] Update beam-master version

[noreply] [BEAM-13015] Calculate exception for closing BeamFnDataInboundObserver2

[noreply] Minor doc tweaks for validating vendoring. (#16747)

[noreply] [BEAM-13686] OOM while logging a large pipeline even when logging level

[noreply] [BEAM-13629] Update URL artifact type for Dataflow Go (#16490)

[noreply] [BEAM-13832] Add automated expansion service start-up to JDBCio (#16739)

[noreply] [BEAM-13831] Add automated expansion service infra into Debezium Read()

[noreply] [BEAM-13821] Add automated expansion service start-up to KafkaIO

[noreply] [BEAM-13799] Created a Dataproc cluster manager for Interactive Beam

[noreply] Merge pull request #16727: [BEAM-11971] remove unsafe Concurrent data


------------------------------------------
[...truncated 55.33 KB...]
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.11-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2654153 sha256=ebc96c1abc4c8ed1e056ecd9f05bbb9fb5c2fda7663325aab055097d41cf4d48
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.0 azure-storage-blob-12.9.0 boto3-1.20.49 botocore-1.23.49 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644076238.523205/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220205155038524884-4545'
 createTime: '2022-02-05T15:50:46.078190Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-05_07_50_44-7003481814432688537'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0205150454'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-05T15:50:46.078190Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-05_07_50_44-7003481814432688537]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-05_07_50_44-7003481814432688537
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-05_07_50_44-7003481814432688537?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-05_07_50_44-7003481814432688537 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:53.723Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.404Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.439Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.507Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.534Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.576Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.600Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.654Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.692Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.719Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.747Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.782Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.838Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:54.996Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:55.040Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:55.095Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:55.127Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:55.182Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:55.216Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:50:55.265Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:51:32.268Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:51:39.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:52:04.201Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T15:52:04.235Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-05_07_50_44-7003481814432688537 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: cfc5863c5cfb44e68611175e2ed78080 and timestamp: 1644077024.1623921:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0205150454.1644077029.312100/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220205160349313115-1410'
 createTime: '2022-02-05T16:03:57.106577Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-05_08_03_56-3605392844664302804'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0205150454'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-05T16:03:57.106577Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-05_08_03_56-3605392844664302804]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-05_08_03_56-3605392844664302804
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-05_08_03_56-3605392844664302804?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-05_08_03_56-3605392844664302804 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:04.801Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:10.944Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:10.981Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.048Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.108Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.141Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.212Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.288Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.325Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.361Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.394Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.488Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.520Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.583Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.625Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.661Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.798Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.839Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.875Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.908Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.939Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:11.995Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:12.029Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:12.063Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:23.560Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:04:57.104Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:05:22.176Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-05T16:05:22.228Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-05_08_03_56-3605392844664302804 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4debc68d6dff47ad9ca00376577cef9c and timestamp: 1644077811.9269166:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 101
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-05_07_50_44-7003481814432688537?project=apache-beam-testing
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_ecedfbb9-d71b-44b3-983e-bb2a08781d05_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-05_08_03_56-3605392844664302804?project=apache-beam-testing
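The TypeError in this traceback is the well-known google-cloud-pubsub 1.x -> 2.x break: the 2.x generated client parses a positional argument as a DeleteSubscriptionRequest mapping, so a bare subscription-path string no longer works. A minimal sketch of the failing and working calls (the subscription path here is illustrative):

from google.cloud import pubsub_v1

sub_client = pubsub_v1.SubscriberClient()
read_sub_name = "projects/apache-beam-testing/subscriptions/example_read"

# Fails on google-cloud-pubsub >= 2.0 with the TypeError above, because
# the positional string is treated as a request mapping:
# sub_client.delete_subscription(read_sub_name)

# Works on 2.x: pass the path by keyword (or as a request dict/object).
sub_client.delete_subscription(subscription=read_sub_name)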

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 28s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bynvnkeexe2dk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #606

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/606/display/redirect?page=changes>

Changes:

[Kiley Sok] Allow Java 17 to be used in SDK

[Kiley Sok] add testing support

[Kiley Sok] Add more testing support for java 17

[Kiley Sok] workaround for jamm

[Kiley Sok] Add Jenkins test for Java 17

[Kiley Sok] Fix jvm hex and skip errorprone

[Kiley Sok] Fix display data for anonymous classes

[Kiley Sok] fix jpms tests

[Kiley Sok] skip zetasql

[Kiley Sok] spotless

[Kiley Sok] spotless

[Kiley Sok] Fix trigger

[Kiley Sok] skip checker framework

[Kiley Sok] fix app name

[Kiley Sok] remove duplicate property check

[Heejong Lee] [BEAM-13813] Add support for URL artifact to extractStagingToPath

[mmack] [BEAM-13663] Remove unused duplicate option for AWS client configuration

[avilovpavel6] Remove Python SQL Test example from catalog

[relax] Fix timer consistency in direct runner

[noreply] [BEAM-13757] adds pane observation in DoFn (#16629)

[Jan Lukavský] Change links to Books from Amazon to Publisher

[noreply] [BEAM-13605] Add support for pandas 1.4.0 (#16590)

[noreply] [BEAM-13761] adds Debezium IO wrapper for Go SDK (#16642)

[noreply] [BEAM-13024] Unify PipelineOptions behavior (#16719)

[noreply] Update sdks/go/pkg/beam/artifact/materialize_test.go

[noreply] Merge pull request #16605 from [BEAM-13634][Playground] Create a

[noreply] Merge pull request #16593 from [BEAM-13725][Playground] Add graph to the

[noreply] Merge pull request #16699 from [BEAM-13789][Playground] Change logic of

[alexander.chermenin] Fixed CSS for Case study page

[mmack] [BEAM-13203] Deprecate SnsIO.writeAsync for AWS Sdk v2 due to risk of


------------------------------------------
[...truncated 55.88 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2654126 sha256=507255b9a66b508c73c886ac0a170e405fd424a9690c7f2fc359ba7818c2dd3c
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.22.0 azure-storage-blob-12.9.0 boto3-1.20.48 botocore-1.23.48 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643989840.373719/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
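The "Discarding unparseable args" warnings above are emitted for flags that no registered options class declares; the load-test harness reads --pubsub_namespace_prefix itself, so the warning is likely benign here. For reference, a sketch of how such a flag is made parseable via a PipelineOptions subclass (the class name is hypothetical; the flag name and value come from these logs):

from apache_beam.options.pipeline_options import PipelineOptions

class PubsubNamespaceOptions(PipelineOptions):  # hypothetical name
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_argument(
            '--pubsub_namespace_prefix',
            default=None,
            help='Prefix for temporary Pub/Sub topics and subscriptions.')

options = PubsubNamespaceOptions(
    ['--pubsub_namespace_prefix=pubsub_io_performance_'])
print(options.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix)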
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220204155040374706-4263'
 createTime: '2022-02-04T15:50:47.313712Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-04_07_50_46-7117274574648056503'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0204150510'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-04T15:50:47.313712Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-04_07_50_46-7117274574648056503]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-04_07_50_46-7117274574648056503
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-04_07_50_46-7117274574648056503?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-04_07_50_46-7117274574648056503 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:54.963Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.762Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.792Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.861Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.899Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.927Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.956Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:55.984Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.039Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.066Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.092Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.123Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.245Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.403Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.438Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.465Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.509Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.561Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.591Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:50:56.640Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:51:30.861Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:51:39.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:52:03.226Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T15:52:03.256Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-04_07_50_46-7117274574648056503 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8117b1b49dd94f089fe073c530d9c3a4 and timestamp: 1643990675.353889:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0204150510.1643990679.112426/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220204160439113377-8911'
 createTime: '2022-02-04T16:04:46.278609Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-04_08_04_45-2486097846350970183'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0204150510'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-04T16:04:46.278609Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-04_08_04_45-2486097846350970183]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-04_08_04_45-2486097846350970183
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-04_08_04_45-2486097846350970183?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-04_08_04_45-2486097846350970183 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:54.119Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.025Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.053Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.118Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.194Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.223Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.300Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.362Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.403Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.428Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.560Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.625Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.661Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.696Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.795Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.837Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.869Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.898Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.931Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:55.966Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:56.022Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:56.056Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:04:56.079Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:05:05.064Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:05:40.883Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:06:06.520Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-04T16:06:06.555Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-04_08_04_45-2486097846350970183 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f3feaf4f45cf41dfa2b834b7b68cc78c and timestamp: 1643991490.0113404:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 204
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-04_07_50_46-7117274574648056503?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-04_08_04_45-2486097846350970183?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_35f70355-0276-48ec-8ca8-3942fc18a5f5_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 49s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cplv3jkpyohwm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #605

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/605/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13737][Playground]

[Robert Bradshaw] Make num-stages counter into an internal counter.

[Kenneth Knowles] [BEAM-13768] Fix NullPointerException in BigQueryStorageSourceBase

[Robert Bradshaw] Avoid packaging avro in the java harness jar.

[noreply] [BEAM-4665] Allow joining a running dataflow pipeline without throwing

[noreply] [BEAM-13801] Add standard coder tests for state backed iterable.

[noreply] [BEAM-13430]  Fix provided configuration by removing extendsFrom for

[noreply] [BEAM-12830] Print clearer go version fail message (#16693)

[Jan Lukavský] Add reference to Books to Learning Resources in website

[noreply] Use ThreadLocal for DESERIALIZATION_CONTEXT (#16680)

[noreply] Minor: Add apt update after adding deadsnakes repository in dev env

[noreply] [BEAM-13807] Regenerate container images to get TF 2.8.0 (#16707)

[noreply] [BEAM-13399, BEAM-13683] Eagerly materialize artifacts for automated

[noreply] Merge pull request #16617 from [BEAM-13743] [Playground] Add context

[noreply] Merge pull request #16618 from [BEAM-13744] [Playground] Add context

[noreply] Merge pull request #16698 from [BEAM-13802][Playground] [Bugfix] Clean

[noreply] [BEAM-13293][BEAM-13806] Pipe a SchemaIO flag through Go integration

[noreply] [BEAM-13605] Modify groupby.apply implementation in preparation for

[noreply] Merge pull request #16436 from [BEAM-1330] - DatastoreIO Writes should


------------------------------------------
[...truncated 55.90 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.11-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2653688 sha256=b3dfb7a5fae082c188608330e2781460203f28b004a8e80ae23c7ab4e28f9a36
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.47 botocore-1.23.47 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.1 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643906263.646100/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220203163743647046-4525'
 createTime: '2022-02-03T16:37:50.485573Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-03_08_37_49-11166389455807522858'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0203163614'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-03T16:37:50.485573Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-03_08_37_49-11166389455807522858]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-03_08_37_49-11166389455807522858
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-03_08_37_49-11166389455807522858?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-03_08_37_49-11166389455807522858 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:58.364Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.402Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.437Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.507Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.540Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.567Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.594Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.632Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.683Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.712Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.736Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.765Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.799Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.836Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.855Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:37:59.890Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00.032Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00.068Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00.105Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00.162Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00.211Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:00.272Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:29.216Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:38:47.167Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:39:10.997Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:39:11.019Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-03_08_37_49-11166389455807522858 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 68c9d77dd3e94c4790ac5d6c56b76adc and timestamp: 1643907037.668171:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 116
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0203163614.1643907043.244371/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220203165043245315-6616'
 createTime: '2022-02-03T16:50:50.002316Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-03_08_50_49-4975392867228711169'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0203163614'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-03T16:50:50.002316Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-03_08_50_49-4975392867228711169]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-03_08_50_49-4975392867228711169
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-03_08_50_49-4975392867228711169?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-03_08_50_49-4975392867228711169 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:50:57.432Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.133Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.165Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.222Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.280Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.299Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.363Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.445Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.475Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.527Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.559Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.613Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.647Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.697Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.723Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.789Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.811Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.842Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.874Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.914Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.952Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:00.979Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:01Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:01.046Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:01.069Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:01.098Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:24.292Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
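
Note on the metric-descriptor message above: the project has hit the quota of 100 Dataflow-created custom.googleapis.com descriptors, and the linked Monitoring APIs can list and delete stale ones. A rough sketch with the Cloud Monitoring client (project id illustrative; this cleanup is not part of the test itself):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/my-project',  # illustrative project
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting stale custom descriptors frees quota for new user metrics.
        client.delete_metric_descriptor(name=descriptor.name)
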
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:51:46.952Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:52:11.633Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-03T16:52:11.659Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-03_08_50_49-4975392867228711169 after 603 seconds
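
Note on the 'Timing out on waiting' warning: this is the bounded wait the test places on a streaming job, not a job failure; the Dataflow job keeps running while the test moves on to collect metrics. In the Python SDK the same behavior comes from waiting on the PipelineResult with a duration, roughly (pipeline and duration here are illustrative):

    import apache_beam as beam

    pipeline = beam.Pipeline()
    _ = pipeline | beam.Create(['payload'])  # trivial stand-in for the real job
    result = pipeline.run()
    # duration is in milliseconds; this returns when the job finishes or the
    # timeout elapses, whichever comes first. A streaming job keeps running.
    result.wait_until_finish(duration=600 * 1000)
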
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: d4aa3db0f9be4a86b7f3b922ca4e8618 and timestamp: 1643907824.9409704:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 91
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-03_08_37_49-11166389455807522858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-03_08_50_49-4975392867228711169?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_adf19415-d0a6-47de-a279-1c665322a82a_read'
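
Note on the cleanup TypeError: this matches the breaking change in google-cloud-pubsub 2.x (the install log above shows google-cloud-pubsub-2.9.0), where delete_subscription no longer accepts a bare subscription path as its positional argument; the positional slot now expects a request object, so the path must be passed by keyword or inside a request dict. A minimal sketch of the accepted form (subscription path illustrative):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/my-project/subscriptions/my-sub'  # illustrative path
    # v2 clients take the path by keyword, or as request={'subscription': sub_path};
    # passing the bare string positionally raises the TypeError seen above.
    sub_client.delete_subscription(subscription=sub_path)
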

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 18s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zj5x3juqzllwu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #604

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/604/display/redirect?page=changes>

Changes:

[career] [BEAM-13734] Support cache directories that use GCS buckets

[noreply] Merge pull request #16655 from [BEAM-12164]: Add retry protection to

[noreply] Merge pull request #16586 from [BEAM-13731] FhirIO: Add support for

[noreply] [BEAM-13011] Adds a link to Multi-language Pipelines Tips wiki page

[noreply] [BEAM-12572] Run python examples on multiple runners (#16154)

[noreply] [BEAM-13574] Large Wordcount (#16455)

[noreply] [BEAM-13293] Refactor JDBC IO Go Wrapper (#16686)

[noreply] Edit license script for Java, add manual licenses for xz (#16692)

[mmack] [BEAM-13563] Restructure Kinesis Source for Aws 2 internally to prepare


------------------------------------------
[...truncated 56.83 KB...]
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2652332 sha256=5efe03b1712387f73085d4c47464ac6eed7be22056ebb1c326232720bb318999
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.46 botocore-1.23.46 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817035.121216/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220202155035122177-2322'
 createTime: '2022-02-02T15:50:42.032477Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-02_07_50_41-17274699251788017136'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0202153624'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-02T15:50:42.032477Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-02_07_50_41-17274699251788017136]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-02_07_50_41-17274699251788017136
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-02_07_50_41-17274699251788017136?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-02_07_50_41-17274699251788017136 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:48.969Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.623Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.648Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.699Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.739Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.778Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.812Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.836Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.876Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.903Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.924Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.957Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:49.988Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.014Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.081Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.182Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.230Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.281Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.353Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.414Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.450Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:50:50.485Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:51:09.834Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:51:31.161Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:51:31.195Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:51:41.496Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:51:55.939Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T15:51:55.963Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-02_07_50_41-17274699251788017136 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c629dc3f13a74e7d890bcc99fec505f0 and timestamp: 1643817814.7557771:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0202153624.1643817818.332186/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220202160338333088-2061'
 createTime: '2022-02-02T16:03:45.618896Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-02_08_03_44-11536175496035715207'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0202153624'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-02T16:03:45.618896Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-02_08_03_44-11536175496035715207]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-02_08_03_44-11536175496035715207
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-02_08_03_44-11536175496035715207?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-02_08_03_44-11536175496035715207 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:52.263Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.337Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.372Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.468Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.561Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.585Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.652Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.725Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.770Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.826Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.866Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.898Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.930Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:53.973Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.005Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.040Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.209Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.229Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.275Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.311Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.331Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.355Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.411Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.467Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:03:54.497Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:04:09.831Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:04:34.913Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:05:00.367Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-02T16:05:00.410Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-02_08_03_44-11536175496035715207 after 600 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_fab13ae8-64ef-4230-80ef-cb79771a4724_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-02_07_50_41-17274699251788017136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-02_08_03_44-11536175496035715207?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError:
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_fab13ae8-64ef-4230-80ef-cb79771a4724_read'
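
Two failures are chained in this traceback. The real signal is the matcher AssertionError: the read pipeline never delivered its single expected summary message within the 900 s matcher timeout. The DeleteSubscriptionRequest TypeError is then raised from cleanup() while handling that assertion, and it is the same pubsub v2 keyword-argument issue sketched after the first traceback above. The expected payload b'2097152' is consistent with a count of 1 KiB messages over a 2 GiB corpus; that reading is an assumption based on the '2gb' job name:

    # Hypothetical reconstruction of the expected matcher payload:
    input_bytes = 2 * 1024 ** 3   # 2 GiB of generated input
    message_size = 1024           # assumed 1 KiB per Pub/Sub message
    expected_count = input_bytes // message_size
    assert str(expected_count).encode() == b'2097152'
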

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 23s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lmex4ppdsz5za

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #603

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/603/display/redirect?page=changes>

Changes:

[daria.malkova] Support SCIO SDK via sbt projects

[samuelw] [BEAM-11648] Share thread pool across RetryManager instances.

[Pablo Estrada] Exclude per-key order tests on Twister2 runner

[Heejong Lee] Fix Java SDK container image name for load-tests and nexmark

[daria.malkova] Change executable name for go tests

[avilovpavel6] Fix java test

[noreply] [BEAM-13769] Skip test_main_session_not_staged_when_using_cloudpickle

[noreply] [BEAM-6744] Support implicitly setting project id in Go Dataflow runner

[noreply] Merge pull request #16493 from [BEAM-13632][Playground] Save catalog

[noreply] Exclude jul-to-slf4j from Spark runner in quickstart POM templates

[noreply] [BEAM-11936] Enable a few errorprone checks that were broken by pinned

[noreply] [BEAM-13780] Add CONTRIBUTING.md pointing to main guide (#16666)

[noreply] [BEAM-13777] Accept cache capacity as input parameter instead of default

[noreply] [BEAM-13051][A] Enable pylint warnings

[noreply] [BEAM-13779] Fix pr labeling (#16665)

[noreply] Merge pull request #16581 from [BEAM-12164]: Add

[noreply] Fix labeler trigger (#16674)

[noreply] [BEAM-13781] Exclude grpc-netty-shaded from gax-grpc's dependency

[noreply] [BEAM-13051] Fixed pylint warnings : raising-non-exception (E0710),

[noreply] [BEAM-13740] Correctly install go before running tests (#16673)

[noreply] [BEAM-12830] Update local Docker env Go version. (#16670)

[noreply] [BEAM-13051][B] Enable pylint warnings

[noreply] [BEAM-13430] Revert Spark libraries in Spark runner to provided (#16675)

[noreply] [BEAM-12240] Add Java 17 support (#16568)

[noreply] [BEAM-13760] Add random component to default python dataflow job name


------------------------------------------
[...truncated 55.88 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2646201 sha256=95b6150eaf3091cdbb682cdf3c2b1ab29b2abd8dbd2d3d95ef337463f7b0cabf
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.46 botocore-1.23.46 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643730632.924351/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220201155032925321-8744'
 createTime: '2022-02-01T15:50:39.149467Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-01_07_50_38-10757601097435167073'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0201150657'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-01T15:50:39.149467Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-01_07_50_38-10757601097435167073]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-01_07_50_38-10757601097435167073
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-01_07_50_38-10757601097435167073?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-01_07_50_38-10757601097435167073 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:58.517Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.651Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.675Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.744Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.783Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.823Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.853Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.891Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.929Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.959Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:50:59.993Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.014Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.060Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.084Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
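
The fusion messages above trace the write side of the test: a synthetic bounded source is expanded into an SDF read, its output is formatted into Pub/Sub payloads, timed, and written out. A rough sketch of that shape, reconstructed from the step labels only; the source, the metrics step, and the topic below are placeholders, not the test's real values, so this is not runnable against a real topic as-is:

import apache_beam as beam

TOPIC = 'projects/apache-beam-testing/topics/<topic>'  # placeholder

with beam.Pipeline() as p:
    _ = (
        p
        | 'Create input' >> beam.Create([(b'key', b'x' * 1024)] * 10)  # stand-in for the synthetic SDF source
        | 'Format to pubsub message in bytes' >> beam.Map(lambda kv: kv[1])
        | 'Measure time' >> beam.Map(lambda e: e)  # the real step records runtime metrics
        | 'Write to Pubsub' >> beam.io.WriteToPubSub(TOPIC)  # ToProtobuf + Write in the log
    )
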
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.241Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.271Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.299Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.335Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.384Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.419Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:00.448Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:29.409Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:51:48.938Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:52:10.036Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T15:52:10.090Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-01_07_50_38-10757601097435167073 after 602 seconds
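
The "Timing out on waiting" line is expected for these streaming runs: the harness waits on the job for a bounded period and then moves on to collect metrics while the job keeps running. In the Python SDK that corresponds to the duration argument of wait_until_finish, which is given in milliseconds; a minimal sketch:

import apache_beam as beam

def run_with_bounded_wait(pipeline: beam.Pipeline, timeout_ms: int = 600 * 1000):
    # wait_until_finish(duration=...) returns after roughly timeout_ms even if
    # the (streaming) job is still in JOB_STATE_RUNNING.
    result = pipeline.run()
    result.wait_until_finish(duration=timeout_ms)
    return result.state
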
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 215d3804840243ed85a8f799ddda9747 and timestamp: 1643731426.5337248:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 106
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
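
The InfluxDB failure is on the server side: the meta store cannot be written because the disk hosting /var/lib/influxdb is full, so every publish attempt in this build comes back as HTTP 500. For context, a minimal sketch of the kind of InfluxDB 1.x write call involved; the host and database names below are placeholders:

import requests

INFLUX_URL = 'http://influxdb.example:8086/write'  # placeholder host
resp = requests.post(
    INFLUX_URL,
    params={'db': 'beam_test_metrics'},  # placeholder database name
    data='pubsub_io_perf_write_runtime value=106',  # InfluxDB line protocol
)
if resp.status_code != 204:  # the v1 write API returns 204 No Content on success
    print('Failed to publish metrics:', resp.status_code, resp.text)
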
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0201150657.1643731430.575287/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220201160350576145-9946'
 createTime: '2022-02-01T16:03:57.532549Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-01_08_03_56-6247412861784006422'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0201150657'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-01T16:03:57.532549Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-02-01_08_03_56-6247412861784006422]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-02-01_08_03_56-6247412861784006422
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-01_08_03_56-6247412861784006422?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-02-01_08_03_56-6247412861784006422 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:04.830Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.632Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.659Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.729Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.788Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.825Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.888Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.955Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:05.996Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.070Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.100Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.131Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.164Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.330Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.375Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.408Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
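
The second job is the read side of the test: it consumes the messages published by the write job, counts them in a window, and republishes the count to Pub/Sub so a matcher can verify it. Reconstructed roughly from the step names above, with placeholder resources and an assumed window size:

import apache_beam as beam
from apache_beam.transforms import window
from apache_beam.transforms.combiners import CountCombineFn

SUB = 'projects/apache-beam-testing/subscriptions/<read-sub>'    # placeholder
MATCHER_TOPIC = 'projects/apache-beam-testing/topics/<matcher>'  # placeholder

with beam.Pipeline() as p:
    _ = (
        p
        | 'Read from pubsub' >> beam.io.ReadFromPubSub(subscription=SUB)
        | 'Map' >> beam.Map(lambda msg: len(msg))   # stand-in for the lambda at pubsub_io_perf_test.py:171
        | 'Measure time' >> beam.Map(lambda e: e)   # the real step records runtime metrics
        | 'Window' >> beam.WindowInto(window.FixedWindows(60))  # window size is an assumption
        | 'Count messages' >> beam.CombineGlobally(CountCombineFn()).without_defaults()
        | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
        | 'Write to Pubsub' >> beam.io.WriteToPubSub(MATCHER_TOPIC)
    )
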
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.451Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.487Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.509Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.535Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.599Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.628Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:06.686Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:15.266Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:04:48.464Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:05:14.056Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-02-01T16:05:14.094Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-02-01_08_03_56-6247412861784006422 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e13c9b94eae44157bd46968ee9ee8849 and timestamp: 1643732243.2563095:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 111
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_d3dc18e5-0458-4a33-b0ee-bd0168201dda_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-01_07_50_38-10757601097435167073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-01_08_03_56-6247412861784006422?project=apache-beam-testing
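
The TypeError in cleanup is a google-cloud-pubsub 1.x-to-2.x API issue rather than a pipeline failure: in the 2.x client, the first positional argument of delete_subscription is a DeleteSubscriptionRequest (or a dict-like mapping for one), so passing the subscription path as a bare string raises exactly the error above. A sketch of the failing call and the 2.x-style equivalents, with a placeholder path:

from google.cloud import pubsub_v1

sub_client = pubsub_v1.SubscriberClient()
sub_path = 'projects/apache-beam-testing/subscriptions/<name>'  # placeholder

# Fails on google-cloud-pubsub 2.x: the bare string is interpreted as a
# DeleteSubscriptionRequest mapping, raising the TypeError seen above.
# sub_client.delete_subscription(sub_path)

# Works on 2.x: pass the path by keyword, or wrap it in a request mapping.
sub_client.delete_subscription(subscription=sub_path)
# equivalently:
# sub_client.delete_subscription(request={'subscription': sub_path})
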

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 2s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6tum5dy4v66i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #602

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/602/display/redirect?page=changes>

Changes:

[mrudary] Generalize S3FileSystem to support multiple URI schemes.


------------------------------------------
[...truncated 56.83 KB...]
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645955 sha256=88ec65dfd2e87dd70737dc92f78c2314a75465f2d3028efa04851bf04e039eb0
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.46 botocore-1.23.46 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.11 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643644239.205920/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220131155039206887-6833'
 createTime: '2022-01-31T15:50:45.247515Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-31_07_50_44-5987394211307244592'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0131150522'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-31T15:50:45.247515Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-31_07_50_44-5987394211307244592]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-31_07_50_44-5987394211307244592
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-31_07_50_44-5987394211307244592?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-31_07_50_44-5987394211307244592 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:57.967Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.105Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.132Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.198Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.228Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.268Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.299Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.332Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.373Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.400Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.432Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.467Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.501Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.534Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.705Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.755Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.799Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.822Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.868Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.897Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:50:59.936Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:51:26.487Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:51:41.144Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:52:07.840Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T15:52:07.871Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-31_07_50_44-5987394211307244592 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: cf7f5f7110ba44ba924dda69b8ec76c0 and timestamp: 1643645022.502938:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 102
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: open /var/lib/influxdb/meta/meta.dbtmp: no space left on device
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0131150522.1643645027.922886/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220131160347923811-1887'
 createTime: '2022-01-31T16:03:53.972346Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-31_08_03_53-17129838909320009085'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0131150522'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-31T16:03:53.972346Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-31_08_03_53-17129838909320009085]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-31_08_03_53-17129838909320009085
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-31_08_03_53-17129838909320009085?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-31_08_03_53-17129838909320009085 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:00.643Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.508Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.539Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.605Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.673Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.703Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.769Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.838Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.879Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.907Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:01.992Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.118Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.184Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.212Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.289Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.322Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.402Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.437Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.462Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.498Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.553Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.589Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:02.620Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:37.400Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:04:44.271Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:05:10.363Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-31T16:05:10.414Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-31_08_03_53-17129838909320009085 after 603 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_245c0049-1e57-452f-b7d4-3e1d0c473b4b_read_matcher.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-31_07_50_44-5987394211307244592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-31_08_03_53-17129838909320009085?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_245c0049-1e57-452f-b7d4-3e1d0c473b4b_read'
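
This build fails one step before the cleanup bug: the pubsub matcher polled the *_read_matcher subscription for 900 seconds, received 0 of the 1 expected messages (the expected payload b'2097152' is the message count the read pipeline should have republished), and the hamcrest assertion in TestDataflowRunner raised. A hedged sketch of how such a success matcher is attached; the PubSubMessageMatcher argument names below are assumptions for illustration, not verified against this Beam version:

from hamcrest import assert_that
from apache_beam.io.gcp.tests.pubsub_matcher import PubSubMessageMatcher

def check_read_output(result):
    # Argument names are assumptions for illustration.
    matcher = PubSubMessageMatcher(
        project='apache-beam-testing',
        sub_name='projects/apache-beam-testing/subscriptions/<matcher-sub>',  # placeholder
        expected_msg=[b'2097152'],  # the payload the assertion above expected
        timeout=900,                # matches the 900 s poll in the ERROR line
    )
    assert_that(result, matcher)  # raises the AssertionError shown above on a miss
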

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 41s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/luszzd4tquwwe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #601

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/601/display/redirect>

Changes:


------------------------------------------
[...truncated 55.85 KB...]
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645956 sha256=48506367fcaa01797a178e90b5fc3ce377871833095eb07eabd3e5b12197f28c
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, mypy-extensions, docopt, crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, atomicwrites, typing-inspect, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, libcst, hdfs, grpc-google-iam-v1, google-api-core, docker, deprecation, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.46 botocore-1.23.46 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643557834.520123/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
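The pair of warnings above is benign: --pubsub_namespace_prefix is consumed by the
test harness itself (it prefixes the generated topic and subscription names seen
later in this log) and is not a registered PipelineOptions flag, so the options
parser reports it as unparseable and drops it. As a rough sketch, a flag like this
could be registered so it parses cleanly (the options class name here is
hypothetical; only the flag name comes from the log):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registered flags are parsed instead of being discarded
            # as "unparseable args".
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix)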
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220130155034521054-5079'
 createTime: '2022-01-30T15:50:41.465659Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-30_07_50_40-18188055447499204771'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0130150451'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-30T15:50:41.465659Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-30_07_50_40-18188055447499204771]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-30_07_50_40-18188055447499204771
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-30_07_50_40-18188055447499204771?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-30_07_50_40-18188055447499204771 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:48.701Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.526Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.560Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.628Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.656Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.683Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.715Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.742Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.765Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.798Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.825Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.856Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.881Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.903Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:49.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
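The "Fusing consumer X into Y" lines above show Dataflow collapsing the adjacent
steps of the write-side pipeline into a single fused stage. The fused chain
corresponds to a linear pipeline of roughly this shape (labels taken from the
log; the transform bodies and topic path are simplified placeholders, and the
real input is a synthetic source rather than Create):

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import WriteToPubSub

    with beam.Pipeline(options=opts) as p:  # opts: streaming PipelineOptions
        (p
         | 'Create input' >> beam.Create([b'x' * 100])
         | 'Format to pubsub message in bytes' >> beam.Map(lambda e: e)
         | 'Measure time' >> beam.Map(lambda e: e)
         | 'Write to Pubsub' >> WriteToPubSub(
             topic='projects/<project>/topics/<topic>'))

Because each step is a ParDo whose output feeds exactly one consumer, the
optimizer can run them all in one worker stage, which is what these messages report.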
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.079Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.104Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.125Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.156Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.211Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.242Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:50:50.270Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:51:03.397Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
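The quota message above is informational, but it suggests pruning unused custom
metric descriptors. A hedged sketch of that cleanup with the Cloud Monitoring
client (the project id comes from the log; the filter is illustrative and the
snippet is untested here):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Only delete descriptors that no pipeline still writes to.
        client.delete_metric_descriptor(name=descriptor.name)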
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:51:30.477Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:51:55.827Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T15:51:55.860Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-30_07_50_40-18188055447499204771 after 602 seconds
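The timeout above is by design: the load test waits a bounded time on the
streaming job instead of blocking forever. In the Python SDK the wait takes a
duration in milliseconds; a minimal sketch of the pattern (variable names are
illustrative):

    result = pipeline.run()
    # duration is in milliseconds; returns when the job finishes or the
    # bound elapses, whichever comes first.
    result.wait_until_finish(duration=600 * 1000)

Note that a streaming job keeps running after such a timed-out wait unless it
is cancelled separately (result.cancel()).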
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 70e26cff28d3449199fd75a1a19a05d4 and timestamp: 1643558618.8310778:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 109
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: engine: error writing WAL entry: write /var/lib/influxdb/wal/beam_test_metrics/a_year/964/_00073.wal: file already closed
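The 500 above comes from the InfluxDB server itself (its WAL segment is
closed), not from the test. The metrics publisher is effectively issuing an
HTTP write against the InfluxDB 1.x line-protocol endpoint; a hedged sketch of
that request (host placeholder; measurement name and value from the log):

    import requests

    # A healthy InfluxDB 1.x server answers 204 No Content on /write.
    resp = requests.post(
        'http://<influx-host>:8086/write',
        params={'db': 'beam_test_metrics'},
        data='pubsub_io_perf_write_runtime value=109')
    if resp.status_code != 204:
        print(resp.status_code, resp.text)  # this build saw a 500 + WAL error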
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0130150451.1643558621.204398/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220130160341205339-9866'
 createTime: '2022-01-30T16:03:48.437726Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-30_08_03_47-8814584250736368804'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0130150451'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-30T16:03:48.437726Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-30_08_03_47-8814584250736368804]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-30_08_03_47-8814584250736368804
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-30_08_03_47-8814584250736368804?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-30_08_03_47-8814584250736368804 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:55.819Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:56.872Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:56.894Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:56.947Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.013Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.042Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.107Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.172Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.210Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.237Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.311Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.343Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.498Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.544Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.612Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.633Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
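The read-side job fused above has a different shape: a Pub/Sub read, windowing,
a global count, and a write back to Pub/Sub. A rough sketch of that pipeline
(labels from the log; lambda bodies, the window size, and resource paths are
illustrative placeholders):

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
    from apache_beam.transforms import window

    with beam.Pipeline(options=opts) as p:  # opts: streaming PipelineOptions
        (p
         | 'Read from pubsub' >> ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<sub>')
         | 'Map' >> beam.Map(lambda msg: msg)
         | 'Measure time' >> beam.Map(lambda e: e)
         | 'Window' >> beam.WindowInto(window.FixedWindows(60))
         | 'Count messages' >> beam.combiners.Count.Globally().without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
         | 'Write to Pubsub' >> WriteToPubSub(
             topic='projects/<project>/topics/<topic>'))

The Count messages/KeyWithVoid and CombinePerKey stages in the log are how the
global count is implemented under the hood.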
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.698Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.716Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.748Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.790Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.857Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.889Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:03:57.919Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:04:29.517Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:04:38.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:05:04.065Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-30T16:05:04.088Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-30_08_03_47-8814584250736368804 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: f9f3a88043e940c68e7391bcef84c54b and timestamp: 1643559415.9227707:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 102
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: engine: error writing WAL entry: write /var/lib/influxdb/wal/beam_test_metrics/a_year/964/_00073.wal: file already closed
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_f5e8534b-7247-4997-b071-c7c928548cdd_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-30_07_50_40-18188055447499204771?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-30_08_03_47-8814584250736368804?project=apache-beam-testing
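The TypeError above is the actual failure, and it happens in cleanup rather than
in the pipeline: the environment installed google-cloud-pubsub 2.9.0, and the
2.x generated clients accept either a request object or keyword arguments.
Passing the subscription path positionally makes the client treat it as a
request mapping, which DeleteSubscriptionRequest rejects. A sketch of the fix
under that reading (subscription path shortened):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/apache-beam-testing/subscriptions/<sub>'

    # Positional form, as called from pubsub_io_perf_test.py line 211,
    # fails on the 2.x surface:
    #   sub_client.delete_subscription(sub_path)
    # Keyword form is accepted:
    sub_client.delete_subscription(subscription=sub_path)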

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 35s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h6d2iu2txw3o6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #600

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/600/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-13751] Don't block on gcloud when attempting to get default GCP

[Kyle Weaver] [BEAM-13751] Parameterize wait timeout so test doesn't waste 2s.

[Kyle Weaver] [BEAM-13751] Add comment explaining sleep.

[noreply] Update Python SDK beam-master tags (#16630)

[noreply] Merge pull request #16592 from [BEAM-13722][Playground] Add precompiling

[noreply] Merge pull request #16505 from [BEAM-13527] [Playground] Pipeline

[noreply] [BEAM-13293] XLang Jdbc IO for Go SDK (#16111)

[noreply] [BEAM-10206] Add Go Vet to Github Actions (#16612)


------------------------------------------
[...truncated 55.84 KB...]
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645956 sha256=871679619c820581fde61c7c59fe2eb2fa63f2b0410eac515fbea93e74fa9d15
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.46 botocore-1.23.46 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643471440.622221/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220129155040623140-9732'
 createTime: '2022-01-29T15:50:47.300174Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-29_07_50_46-3538351901961728664'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0129150449'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-29T15:50:47.300174Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-29_07_50_46-3538351901961728664]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-29_07_50_46-3538351901961728664
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-29_07_50_46-3538351901961728664?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-29_07_50_46-3538351901961728664 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:54.957Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.569Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.603Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.671Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.714Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.747Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.782Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.825Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.865Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.898Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.930Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.963Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:55.997Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.032Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.063Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.097Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.216Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.260Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.294Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.348Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.410Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.433Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:50:56.479Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:51:07.450Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:51:39.409Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:52:03.837Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T15:52:03.875Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-29_07_50_46-3538351901961728664 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 94ad226c7f934d38a65fdb6c27b3ce6b and timestamp: 1643472214.9014523:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 112
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: engine: error writing WAL entry: write /var/lib/influxdb/wal/beam_test_metrics/a_year/964/_00073.wal: file already closed
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0129150449.1643472219.095696/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220129160339096669-2703'
 createTime: '2022-01-29T16:03:45.191561Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-29_08_03_44-12437924282983203665'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0129150449'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-29T16:03:45.191561Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-29_08_03_44-12437924282983203665]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-29_08_03_44-12437924282983203665
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-29_08_03_44-12437924282983203665?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-29_08_03_44-12437924282983203665 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:56.257Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.390Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.417Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.482Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.556Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.614Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.691Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.756Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.806Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.834Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.866Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:57.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.050Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.418Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.644Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.729Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.816Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:58.905Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.005Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.095Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.253Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.322Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.359Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.393Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.459Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.501Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:03:59.534Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:04:19.346Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:04:46.661Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:05:11.826Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-29T16:05:11.858Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-29_08_03_44-12437924282983203665 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 076fec167d814b5e8d61698ab32121e0 and timestamp: 1643473102.163332:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 241
WARNING:apache_beam.testing.load_tests.load_test_metrics_utils:Failed to publish metrics to InfluxDB. Received status code 500 with an error message: engine: error writing WAL entry: write /var/lib/influxdb/wal/beam_test_metrics/a_year/964/_00073.wal: file already closed
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_1babbb27-47d9-47fb-9f61-a0b47a8abdf7_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-29_07_50_46-3538351901961728664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-29_08_03_44-12437924282983203665?project=apache-beam-testing

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 1s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/atijxrghjaucu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #599

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/599/display/redirect?page=changes>

Changes:

[marcin.kuthan] Get rid of unnessecary logs for BigQuery streaming writes in

[dhuntsperger] added GitHub example references to Python multilang quickstart

[mmack] [adhoc] Test S3Options and AwsOptions for Sdk v2

[noreply] [BEAM-13537] Fix NPE in kafkatopubsub example (#16625)

[noreply] [BEAM-13740] update java_tests.yml to remove setup-go, which is

[Heejong Lee] Fix google3 import error

[noreply] [BEAM-12976] Implement Java projection pushdown optimizer. (#16513)

[noreply] Merge pull request #16579 from Revert "Revert "Merge pull request #15863

[noreply] Merge pull request #16606 from [BEAM-13247] [Playground] Embedding


------------------------------------------
[...truncated 55.16 KB...]
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645960 sha256=da8547de87dbb441048d7713589a3c0939717c140ee1ca0808689c0c0fdd990e
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.45 botocore-1.23.45 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.1 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385137.948072/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
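
The "Discarding unparseable args" warning above looks benign: --pubsub_namespace_prefix is consumed by the perf-test harness itself (the pubsub_io_performance_ prefix reappears in the subscription names later in this log), so Beam's PipelineOptions parser drops it as an unknown flag. Registering the flag as a pipeline option would silence the warning; a minimal sketch, with a hypothetical options class (only the flag name is taken from this log):

from apache_beam.options.pipeline_options import PipelineOptions

class PubsubPerfTestOptions(PipelineOptions):  # hypothetical class, for illustration
    @classmethod
    def _add_argparse_args(cls, parser):
        # Flags registered here are parsed instead of being discarded with a warning.
        parser.add_argument(
            '--pubsub_namespace_prefix',
            default=None,
            help='Prefix for Pub/Sub resources created by the test.')
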
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220128155217949033-4430'
 createTime: '2022-01-28T15:52:25.447907Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-28_07_52_24-733989477264354022'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0128150510'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-28T15:52:25.447907Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-28_07_52_24-733989477264354022]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-28_07_52_24-733989477264354022
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-28_07_52_24-733989477264354022?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-28_07_52_24-733989477264354022 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:32.882Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:33.795Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:33.829Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:33.890Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:33.923Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:33.954Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:33.989Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.023Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.110Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.142Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.167Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.196Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.224Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.256Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.283Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.321Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.426Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.461Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.496Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.523Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.601Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.627Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:52:34.665Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:53:06.212Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
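
The metric-descriptor notice above reflects a Cloud Monitoring quota, not an error: the project already holds 100 Dataflow-created descriptors, so new custom.googleapis.com/* metrics are not created and user metrics surface only under dataflow.googleapis.com/job/user_counter. Stale descriptors can be cleared through the Monitoring API; a hedged sketch using google-cloud-monitoring (not part of this test suite; review each descriptor before deleting, since removing one discards its time series):

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/apache-beam-testing"
for descriptor in client.list_metric_descriptors(name=project_name):
    if descriptor.type.startswith("custom.googleapis.com/"):
        # Deletes the custom metric definition identified by its full resource name.
        client.delete_metric_descriptor(name=descriptor.name)
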
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:53:14.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:53:42.888Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T15:53:42.914Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-28_07_52_24-733989477264354022 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 5e8bd493dd36404482e095edc745ca51 and timestamp: 1643385918.0735266:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 120
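
The "Timing out on waiting" warning just above is the expected end of the measurement window, not a job failure: the harness submits the streaming job, waits a bounded interval, then queries the runtime metric reported above. A sketch of the underlying Beam pattern, assuming a trivial pipeline (the ten-minute bound mirrors the ~600-second waits in this log):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

pipeline = beam.Pipeline(options=PipelineOptions(streaming=True))
# ... pipeline graph would be built here ...
result = pipeline.run()
# wait_until_finish takes a duration in milliseconds; a streaming job rarely
# drains on its own, so the wait is bounded before metrics are collected.
result.wait_until_finish(duration=10 * 60 * 1000)
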
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0128150510.1643385922.578624/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220128160522579559-3812'
 createTime: '2022-01-28T16:05:28.986851Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-28_08_05_28-6361431707534997183'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0128150510'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-28T16:05:28.986851Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-28_08_05_28-6361431707534997183]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-28_08_05_28-6361431707534997183
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-28_08_05_28-6361431707534997183?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-28_08_05_28-6361431707534997183 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:37.035Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.107Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.144Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.254Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.384Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.408Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.498Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.560Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.625Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.675Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.714Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.757Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.800Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.836Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:38.964Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.002Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.035Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.068Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.100Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.173Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.256Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.289Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.322Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.426Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.465Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:39.520Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:05:54.603Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:06:24.204Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:06:50.640Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-28T16:06:50.685Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-28_08_05_28-6361431707534997183 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: ab33bca81aa54a66a925121a73b4ab3d and timestamp: 1643386710.5572922:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 90
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_35c7f81a-a23f-440d-899b-b08da5af12e5_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-28_07_52_24-733989477264354022?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-28_08_05_28-6361431707534997183?project=apache-beam-testing
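
The TypeError that fails this build comes from test cleanup, not from the pipeline runs: both jobs submitted, ran, and reported their metrics before the harness tried to delete its subscription. google-cloud-pubsub 2.9.0 is installed above, and the 2.x generated SubscriberClient interprets a positional argument as a DeleteSubscriptionRequest proto, so the bare subscription path string is rejected. A minimal sketch of the keyword form the 2.x API expects (the subscription path below is illustrative):

from google.cloud import pubsub_v1

sub_client = pubsub_v1.SubscriberClient()
sub_path = "projects/apache-beam-testing/subscriptions/pubsub_io_performance_example_read"

# Fails on google-cloud-pubsub >= 2.0, as in the traceback above:
# sub_client.delete_subscription(sub_path)

# Accepted on 2.x: pass the path via the keyword-only 'subscription' argument.
sub_client.delete_subscription(subscription=sub_path)
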

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 10s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7qiwgb6eqp3mq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #598

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/598/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-13093] Enable JavaUsingPython CrossLanguageValidateRunner test for

[mmack] [BEAM-13746] Fix deserialization of SSECustomerKey for AWS Sdk v2

[noreply] [BEAM-7928] Allow users to specify worker disk type for Dataflow runner

[noreply] Merge pull request #16534 from [BEAM-13671][Playground] Add backend

[noreply] [BEAM-13271] Bump errorprone to 2.10.0 (#16231)

[noreply] [BEAM-13595] Don't load main session when cloudpickle is used. (#16589)

[Heejong Lee] Update readme for XVR tests


------------------------------------------
[...truncated 54.68 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645960 sha256=3072cf9c6122b0160dae873bf1640ab91f2ce896e63987f1133cfab198852f25
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.44 botocore-1.23.44 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643298640.031080/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220127155040031992-3623'
 createTime: '2022-01-27T15:50:46.186956Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-27_07_50_45-7028686670327411821'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0127150444'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-27T15:50:46.186956Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-27_07_50_45-7028686670327411821]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-27_07_50_45-7028686670327411821
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-27_07_50_45-7028686670327411821?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-27_07_50_45-7028686670327411821 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:54.757Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.297Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.406Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.491Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.533Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.599Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.642Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.674Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.707Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.744Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.777Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.812Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.843Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.869Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.902Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:55.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.027Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.049Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.085Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.117Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.166Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.193Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:50:56.224Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:51:11.528Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:51:36.988Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:52:08.079Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T15:52:08.114Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-27_07_50_45-7028686670327411821 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6c3b20406b7f4429817ea05c3af5adfe and timestamp: 1643299423.7711763:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 104
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0127150444.1643299428.221215/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220127160348222095-6893'
 createTime: '2022-01-27T16:03:55.515878Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-27_08_03_54-3682893659537096707'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0127150444'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-27T16:03:55.515878Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-27_08_03_54-3682893659537096707]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-27_08_03_54-3682893659537096707
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-27_08_03_54-3682893659537096707?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-27_08_03_54-3682893659537096707 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:03.387Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:04.705Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:04.735Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:04.791Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:04.861Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:04.898Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:04.956Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.011Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.044Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.107Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.140Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.175Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.222Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.266Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.290Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.384Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.444Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.473Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.497Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.539Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.585Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.616Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:05.674Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:36.230Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:04:49.781Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:05:14.885Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-27T16:05:14.913Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-27_08_03_54-3682893659537096707 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c13cee4afb064e84b34dc3c6c0499714 and timestamp: 1643300357.4634194:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 301
Traceback (most recent call last):
  File "/home/jenkins/.pyenv/versions/3.7.10/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/jenkins/.pyenv/versions/3.7.10/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_c9b8359e-a724-4fff-98fe-35a521348257_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-27_07_50_45-7028686670327411821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-27_08_03_54-3682893659537096707?project=apache-beam-testing
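
This build trips over the identical cleanup call; the keyword-argument sketch after the earlier traceback applies here as well. For completeness, the 2.x client also accepts an explicit request mapping; a sketch with an illustrative path:

from google.cloud import pubsub_v1

sub_client = pubsub_v1.SubscriberClient()
# Equivalent to passing subscription= as a keyword argument on pubsub 2.x.
sub_client.delete_subscription(
    request={"subscription":
             "projects/apache-beam-testing/subscriptions/pubsub_io_performance_example_read"})
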

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 57s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gpwo3c5xmimku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #597

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/597/display/redirect?page=changes>

Changes:

[artur.khanin] Privacy policy update regarding Apache Beam Playground

[Daniel Oliveira] [BEAM-13321] Fix exception with BigQuery StreamWriter TraceID.

[mmack] [BEAM-8807] Add integration test for SnsIO.write (Sdk v1 & v2)

[noreply] [BEAM-13736] Make lifting cache exact. (#16603)

[noreply] Merge pull request #16565 from [BEAM-13692][Playground]  Implement

[noreply] Merge pull request #16502 from [BEAM-13650][Playground] Add link for

[noreply] [BEAM-13310] remove call to get offset consumer config, which was rep…


------------------------------------------
[...truncated 55.76 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645742 sha256=ea99bbfe646d3835533e7ad8903e25fa1c0dc6368d6e8b1bdecb1e51ee178d12
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.43 botocore-1.23.43 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.9 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643212232.583495/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
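
The "Discarding unparseable args" warnings mean --pubsub_namespace_prefix is not registered on any PipelineOptions subclass, so Beam's option parser drops it rather than failing. A sketch of how such a flag can be registered (class name and default below are illustrative assumptions, not the test's actual code):

    # Sketch: registering a custom flag so PipelineOptions parses it
    # instead of discarding it. Names here are illustrative.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for temporary Pub/Sub resources.')

    opts = PipelineOptions(
        ['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix
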
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220126155032584442-6691'
 createTime: '2022-01-26T15:50:39.705425Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-26_07_50_38-14699436394426011476'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0126150520'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-26T15:50:39.705425Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-26_07_50_38-14699436394426011476]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-26_07_50_38-14699436394426011476
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-26_07_50_38-14699436394426011476?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-26_07_50_38-14699436394426011476 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:48.744Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.614Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.634Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.701Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.753Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.782Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.814Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.847Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.895Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.917Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.953Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:49.988Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.012Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.239Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.267Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.292Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.323Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.387Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.418Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:50:50.464Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:51:08.842Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
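
The metric-descriptor message above refers to the per-project quota of 100 Dataflow-created custom descriptors; pruning stale ones programmatically might look like the following sketch (assumes the google-cloud-monitoring client library is installed; deletion is destructive, so filter carefully):

    # Sketch: listing and deleting stale custom metric descriptors.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"
    for descriptor in client.list_metric_descriptors(name=project_name):
        if descriptor.type.startswith("custom.googleapis.com/"):
            client.delete_metric_descriptor(name=descriptor.name)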
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:51:30.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:51:56.573Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T15:51:56.611Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-26_07_50_38-14699436394426011476 after 604 seconds
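
The "Timing out on waiting" warning is how the streaming measurement window is bounded, not a failure in itself: the runner stops waiting after the configured timeout and the test reads its metrics. A sketch of capping a streaming run this way (604 seconds mirrors the log; the pipeline object is assumed):

    # Sketch: bounding a streaming run to a fixed measurement window.
    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # 'pipeline' assumed to exist
    result.wait_until_finish(duration=604 * 1000)  # duration is in ms
    if not PipelineState.is_terminal(result.state):
        result.cancel()  # stop the streaming job after the window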
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 32971d22bf9c407fa5b1f33b008137fe and timestamp: 1643213027.7804089:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 103
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0126150520.1643213032.279319/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220126160352280282-3930'
 createTime: '2022-01-26T16:03:59.157894Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-26_08_03_58-4417362137406888464'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0126150520'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-26T16:03:59.157894Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-26_08_03_58-4417362137406888464]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-26_08_03_58-4417362137406888464
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-26_08_03_58-4417362137406888464?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-26_08_03_58-4417362137406888464 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:09.936Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:16.706Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.109Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.175Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.242Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.273Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.350Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.414Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.442Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.468Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.500Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.533Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.599Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.634Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.669Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.734Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.798Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.831Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.883Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.915Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.937Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.958Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:20.989Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:21.044Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:21.076Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:21.118Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:04:35.765Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:05:05.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:05:29.508Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-26T16:05:29.544Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-26_08_03_58-4417362137406888464 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 90fce7ddf0844dd48efb382bcebf0377 and timestamp: 1643213981.421986:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 293
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_815fcdca-c33c-411d-b588-741844e7b13d_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-26_07_50_38-14699436394426011476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-26_08_03_58-4417362137406888464?project=apache-beam-testing
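
This is the same cleanup failure as in build #597: both pipeline runs finished and published their metrics, but the Pub/Sub delete in cleanup() aborts the task. A defensive variant of the cleanup (hypothetical, not the test's actual code) would log and continue instead of failing the whole load test:

    # Sketch: tolerate individual cleanup failures so a broken delete
    # call cannot fail an otherwise successful measurement run.
    import logging
    from google.api_core import exceptions as gapi_exceptions

    def delete_subscriptions(sub_client, subscription_paths):
        for sub in subscription_paths:
            try:
                sub_client.delete_subscription(subscription=sub)
            except (gapi_exceptions.NotFound, TypeError) as exc:
                logging.warning("Could not delete %s: %s", sub, exc)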

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 23s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/y446ypvgpxefe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #596

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/596/display/redirect?page=changes>

Changes:

[Heejong Lee] [BEAM-13716] Clear before creating a new virtual environment in

[mmack] [BEAM-13653] Make SnsIO.write topicArn optional. If provided, validate

[noreply] [BEAM-10897] Update the fastavro lower bound due to an issue on Windows

[noreply] [BEAM-13605] Update pandas_doctests_test denylists in preparation for

[noreply] Merge pull request #16538 from [BEAM-13676][Playground][Bugfix]Build Of

[noreply] Merge pull request #16582 from [BEAM-13711] [Playground] [Bugfix] Add

[noreply] Merge pull request #16515 from [BEAM-13636] [Playground] Checking the

[ningkang0957] [BEAM-13275] Removed the explicit selenium dependency from setup

[noreply] [BEAM-10206] Deprecate unused shallow cloning functions (#16600)

[noreply] Bump Dataflow container versions (#16602)

[noreply] Improved multi-language pipelines section of the programming guide

[mmack] [BEAM-13510] Don't retry on invalid SQS receipt handles.


------------------------------------------
[...truncated 50.24 KB...]
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.9.3-py2.py3-none-any.whl (27 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<5.0,>=4.4.0
  Using cached pytest-4.6.11-py2.py3-none-any.whl (231 kB)
Collecting pytest-xdist<2,>=1.29.0
  Using cached pytest_xdist-1.34.0-py2.py3-none-any.whl (36 kB)
Collecting pytest-timeout<2,>=1.3.3
  Using cached pytest_timeout-1.4.2-py2.py3-none-any.whl (10 kB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.31-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers<4.0.0,>=3.0.3
  Using cached testcontainers-3.4.2-py2.py3-none-any.whl (31 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.9.0-py2.py3-none-any.whl (356 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.21.1-py2.py3-none-any.whl (178 kB)
Collecting boto3>=1.9
  Using cached boto3-1.20.42-py3-none-any.whl (131 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.37.0.dev0) (1.16.0)
Collecting msrest>=0.6.21
  Using cached msrest-0.6.21-py2.py3-none-any.whl (85 kB)
Collecting cryptography>=2.1.4
  Using cached cryptography-36.0.1-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.8 MB)
Collecting s3transfer<0.6.0,>=0.5.0
  Using cached s3transfer-0.5.0-py3-none-any.whl (79 kB)
Collecting botocore<1.24.0,>=1.23.42
  Using cached botocore-1.23.42-py3-none-any.whl (8.5 MB)
Collecting jmespath<1.0.0,>=0.7.1
  Using cached jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
Collecting fasteners>=0.14
  Using cached fasteners-0.17.3-py3-none-any.whl (18 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.8-py3-none-any.whl (39 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]<3.0.0dev,>=1.29.0
  Using cached google_api_core-2.4.0-py2.py3-none-any.whl (111 kB)
Requirement already satisfied: packaging>=14.3 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<3,>=1.6.0->apache-beam==2.37.0.dev0) (21.3)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.1.0-py2.py3-none-any.whl (75 kB)
Collecting libcst>=0.2.5
  Using cached libcst-0.4.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.7 MB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.3-py3-none-any.whl
Collecting google-api-core[grpc]<3.0.0dev,>=1.29.0
  Using cached google_api_core-1.31.5-py2.py3-none-any.whl (93 kB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-auth<3,>=1.18.0->apache-beam==2.37.0.dev0) (60.5.0)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.1.0-py3-none-any.whl (14 kB)
Collecting grpcio-status>=1.18.0
  Using cached grpcio_status-1.43.0-py3-none-any.whl (10.0 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing<3,>=2.4.2
  Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting pbr>=0.11
  Using cached pbr-5.8.0-py2.py3-none-any.whl (112 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (2.1.3)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.12.0-py3-none-any.whl (54 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (1.11.0)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645742 sha256=897f3fa89143a3b86e1c4fa2fc8a102d3f6d9de8715109af068c2d6c5604a16a
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.42 botocore-1.23.42 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0125150517.1643125843.127833/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220125155043128777-5439'
 createTime: '2022-01-25T15:50:49.865778Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-25_07_50_49-13847257344179007252'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0125150517'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-25T15:50:49.865778Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-25_07_50_49-13847257344179007252]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-25_07_50_49-13847257344179007252
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-25_07_50_49-13847257344179007252?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-25_07_50_49-13847257344179007252 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.017Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.608Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.639Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.690Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.732Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.760Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.786Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.814Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.847Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.902Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.931Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.969Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.011Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.034Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.195Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.226Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.255Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.292Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.354Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.383Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.415Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:18.435Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:43.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:43.385Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:53.743Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:52:12.784Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:52:12.814Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
Terminated

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=ef589f68-0b7e-4bac-922a-78d1da90f258, currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 10466
  log file: /home/jenkins/.gradle/daemon/7.3.2/daemon-10466.out.log
----- Last  20 lines from daemon log file - daemon-10466.out.log -----
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:58.969Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.011Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.034Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.195Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.226Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.255Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.292Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.354Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.383Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:50:59.415Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:18.435Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:43.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:43.385Z: JOB_MESSAGE_DETAILED: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:51:53.743Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:52:12.784Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-25T15:52:12.814Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
Terminated
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #595

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/595/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest
ERROR: Issue with creating launcher for agent apache-beam-jenkins-9. The agent has not been fully initialized yet
[EnvInject] - Loading node environment variables.
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest
Building remotely on apache-beam-jenkins-9 (beam)
ERROR: apache-beam-jenkins-9 seems to be offline
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #594

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/594/display/redirect>

Changes:


------------------------------------------
[...truncated 56.41 KB...]
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645720 sha256=a7de60d15a5033db9f7a2820975c7eb44a36c4803c9473c3782ee6daa91a63e0
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.41 botocore-1.23.41 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953035.625831/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
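The "Discarding unparseable args" warnings mean --pubsub_namespace_prefix is not registered with any PipelineOptions subclass; the test harness likely consumes that flag itself, so the warnings are probably benign. For reference, a minimal sketch of how such a flag becomes parseable; the options class name here is hypothetical:

    # Sketch: register a custom flag so PipelineOptions can parse it
    # instead of discarding it as an unknown argument.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubNamespaceOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix used to namespace Pub/Sub resources per test run.')

    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(opts.view_as(PubsubNamespaceOptions).pubsub_namespace_prefix)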
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220123155035626802-9648'
 createTime: '2022-01-23T15:50:41.879459Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-23_07_50_41-1885146763587834879'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0123150451'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-23T15:50:41.879459Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-23_07_50_41-1885146763587834879]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-23_07_50_41-1885146763587834879
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-23_07_50_41-1885146763587834879?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-23_07_50_41-1885146763587834879 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:52.140Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:52.910Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:52.931Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:52.992Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.020Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.039Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.060Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.082Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.111Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.143Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.168Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.196Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.220Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.271Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.316Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.428Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.470Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.503Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.529Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.573Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.590Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:50:53.612Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:51:25.934Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:51:37.525Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:52:02.341Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T15:52:02.373Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-23_07_50_41-1885146763587834879 after 603 seconds
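The timeout warning above is the load-test harness deliberately bounding how long it waits for a streaming job to reach a terminal state; it is not the job failing. A minimal, self-contained sketch of that wait-with-deadline pattern using Beam's public PipelineResult API (this is not the harness code itself):

    # Sketch: wait a bounded time for a pipeline, then move on
    # (optionally cancelling) instead of blocking forever.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline(options=PipelineOptions())
    _ = pipeline | beam.Create([1, 2, 3])  # trivial stand-in pipeline

    result = pipeline.run()
    result.wait_until_finish(duration=600_000)  # duration is in milliseconds
    if not PipelineState.is_terminal(result.state):
        # Streaming jobs rarely terminate on their own; cancel if desired.
        result.cancel()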
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8b44760450fb43e88d0540c71a95879c and timestamp: 1642953809.1890063:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 138
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0123150451.1642953812.732316/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220123160332733316-8506'
 createTime: '2022-01-23T16:03:39.551350Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-23_08_03_38-598633096860800519'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0123150451'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-23T16:03:39.551350Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-23_08_03_38-598633096860800519]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-23_08_03_38-598633096860800519
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-23_08_03_38-598633096860800519?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-23_08_03_38-598633096860800519 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:46.392Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.048Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.078Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.144Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.226Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.278Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.342Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.409Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.450Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.485Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.514Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.547Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.584Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.616Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.637Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.663Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.688Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.720Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.753Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.843Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.874Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.907Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.930Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:47.954Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:48.011Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:48.046Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:48.086Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:03:59.256Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:04:34.078Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:05:00.497Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-23T16:05:00.525Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-23_08_03_38-598633096860800519 after 604 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_d28fd1e3-63d5-456c-8d6f-e0d7b7220cf0_read_matcher.
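The matcher gave up after 900 seconds without receiving the single expected message, and the AssertionError below prints an expected-vs-received payload diff. A small sketch of how such a diff can be computed with collections.Counter (the Beam matcher's internals may differ; the payload b'2097152' is copied from the log):

    # Sketch: expected-vs-received message-payload diff, in the same
    # "Diffs (item, count)" shape as the AssertionError below.
    from collections import Counter

    expected = Counter({b'2097152': 1})  # payload taken from the log
    received = Counter()                 # no messages arrived in 900 s

    print('Expected but not in actual:', dict(expected - received).items())
    print('Unexpected:', dict(received - expected).items())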
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-23_07_50_41-1885146763587834879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-23_08_03_38-598633096860800519?project=apache-beam-testing
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 149, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py",> line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py",> line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py",> line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py",> line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_d28fd1e3-63d5-456c-8d6f-e0d7b7220cf0_read'
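The TypeError above is the cleanup bug itself: with google-cloud-pubsub 2.9.0 (installed earlier in this log), SubscriberClient methods interpret a bare positional string as a request object, which DeleteSubscriptionRequest rejects. A hedged sketch of the failing call and the 2.x-style keyword form, with the subscription path copied from the log:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_d28fd1e3-63d5-456c-8d6f-e0d7b7220cf0_read')

    # Raises TypeError on google-cloud-pubsub >= 2.0: the positional
    # argument is interpreted as a DeleteSubscriptionRequest mapping.
    # sub_client.delete_subscription(sub_path)

    # 2.x-style call: pass the path as the flattened keyword argument.
    sub_client.delete_subscription(subscription=sub_path)

Passing the path as a keyword argument keeps the call compatible with the 2.x generated clients.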

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 30s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ybgkjwqueuj22

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #593

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/593/display/redirect?page=changes>

Changes:

[ningkang0957] [BEAM-13687] Improved Spanner IO request count metrics

[noreply] [BEAM-10206] Add key for fields in wrapper (#16583)

[noreply] Merge pull request #16530 from Adding JSON support in SpannerIO and

[noreply] [BEAM-13685] Enable users to specify cache directory under Interactive


------------------------------------------
[...truncated 55.21 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2645720 sha256=aca1f776588acb38471f5079ddcc9c7586d077514ce3b37ddb1b1a43be4a8151
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.41 botocore-1.23.41 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.6 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642866622.915429/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220122155022916418-3415'
 createTime: '2022-01-22T15:50:29.428010Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-22_07_50_28-6796213495083151030'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0122150504'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-22T15:50:29.428010Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-22_07_50_28-6796213495083151030]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-22_07_50_28-6796213495083151030
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-22_07_50_28-6796213495083151030?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-22_07_50_28-6796213495083151030 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:46.766Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:47.781Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:47.811Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:47.881Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:47.908Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:47.940Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:47.972Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.014Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.104Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.149Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.194Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.258Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.331Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.357Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.380Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.470Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.504Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.526Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.558Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.604Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.642Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:50:48.681Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:51:06.518Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:51:33.696Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:51:58.861Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T15:51:58.892Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-22_07_50_28-6796213495083151030 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: bad9b20a487a459487bfb2f07a563dc8 and timestamp: 1642867417.064701:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 93
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0122150504.1642867419.856575/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220122160339857577-8074'
 createTime: '2022-01-22T16:03:47.170601Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-22_08_03_46-5561093664689412717'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0122150504'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-22T16:03:47.170601Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-22_08_03_46-5561093664689412717]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-22_08_03_46-5561093664689412717
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-22_08_03_46-5561093664689412717?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-22_08_03_46-5561093664689412717 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:03:59.113Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.235Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.264Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.329Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.405Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.442Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.506Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.594Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.641Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:04.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.634Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.805Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.849Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.909Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.942Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:05.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.041Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.092Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.156Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.210Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.234Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.272Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.339Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.410Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.451Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:06.485Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:04:25.569Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
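
This quota warning points at the Monitoring API methods for pruning stale custom metric descriptors. A hedged sketch using the google-cloud-monitoring client (project name and filter are hypothetical; verify what a descriptor is used for before deleting it, since deleting a descriptor also removes its historical time series):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/example-project',  # hypothetical project
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # descriptor.name is the full resource name of the metric descriptor.
        client.delete_metric_descriptor(name=descriptor.name)
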
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:05:01.271Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:05:25.787Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-22T16:05:26.262Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-22_08_03_46-5561093664689412717 after 604 seconds
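
The "Timing out on waiting" warning is the client-side wait expiring, not the job failing: the streaming job keeps running while the test moves on to collect metrics and clean up. In Beam Python the bounded wait looks roughly like this sketch (the cancel step is an assumption about how such tests tidy up):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(streaming=True))  # transforms elided
    result = p.run()
    # wait_until_finish takes a duration in milliseconds; for a streaming job it
    # returns once the timeout elapses even though the job is still running.
    result.wait_until_finish(duration=600 * 1000)
    result.cancel()  # assumed cleanup step for the still-running job
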
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e718f2ae355b47119694c18ab72810fe and timestamp: 1642868409.1256104:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 342
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-22_07_50_28-6796213495083151030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-22_08_03_46-5561093664689412717?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py">, line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py">, line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py">, line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py">, line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py">, line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py">, line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_d09aef4d-a79d-45b3-b5ee-6751ca595640_read'
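
This TypeError is an API-compatibility issue in the test's cleanup, not a pipeline failure: in google-cloud-pubsub 2.x (these builds resolve google-cloud-pubsub-2.9.0), the generated client methods take either a request object or keyword arguments, so a subscription path passed positionally is interpreted as a DeleteSubscriptionRequest mapping and rejected. A minimal sketch of the failing call and the usual fix, under that assumption (the subscription path is hypothetical):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/example-project/subscriptions/example_read'  # hypothetical path

    # Fails on google-cloud-pubsub>=2.0: the positional string is treated as a
    # DeleteSubscriptionRequest mapping, raising the TypeError seen above.
    # sub_client.delete_subscription(sub_path)

    # Works: pass the path as the keyword argument the 2.x surface expects.
    sub_client.delete_subscription(subscription=sub_path)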

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 50s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wqlfwwqy7iofq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #592

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/592/display/redirect?page=changes>

Changes:

[thiagotnunes] [BEAM-12164]: Add SDF for reading change stream records

[tuyarer] [BEAM-13577] Beam Select's uniquifyNames function loses nullability of

[sergey.kalinin] Update GH Actions to use proper variables names and proper triggers

[dhuntsperger] edited README and comments in Python multi-lang pipes examples

[Pablo Estrada] Revert "Merge pull request #15863 from [BEAM-13184] Autosharding for

[Pablo Estrada] BEAM-13611 reactivating jdbcio xlang test

[Steve Niemitz] [BEAM-13689] Output token elements when BQ batch writes complete.

[noreply] Merge pull request #16371 from [BEAM-13518][Playground] Beam Playground

[noreply] Update Java FnAPI beam master (#16572)

[noreply] [BEAM-13699] Replace fnv with maphash. (#16573)

[noreply] [BEAM-13693] Bump

[noreply] [BEAM-10206] Remove Fatalf calls in non-test goroutines for

[noreply] [BEAM-13430] Re-add provided configuration (#16552)

[noreply] Merge pull request #16540 from [BEAM-13678][Playground]Update Github

[noreply] Merge pull request #16546 from [BEAM-13661] [BEAM-13704] [Playground]

[noreply] Merge pull request #16369 from [BEAM-13558] [Playground] Hide the Graph


------------------------------------------
[...truncated 55.55 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644449 sha256=eaa549db857e755e25678322ecca1813ca9bef3e9d36b38de723468a0b408c6f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.40 botocore-1.23.40 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.31 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642780235.309983/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
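
The "Discarding unparseable args" warning means --pubsub_namespace_prefix is not registered with the pipeline-options parser (it is presumably consumed elsewhere by the test harness), so PipelineOptions drops it. Custom flags are normally declared on a PipelineOptions subclass, along these lines (the class name and flag semantics are hypothetical):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):  # hypothetical name
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix',
                default=None,
                help='Prefix for namespacing test topics and subscriptions.')

    # Once registered, the flag parses instead of being discarded.
    opts = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = opts.view_as(PubsubPerfOptions).pubsub_namespace_prefix
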
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220121155035310928-2074'
 createTime: '2022-01-21T15:50:42.509041Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-21_07_50_41-17395325955460464067'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0121150441'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-21T15:50:42.509041Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-21_07_50_41-17395325955460464067]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-21_07_50_41-17395325955460464067
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-21_07_50_41-17395325955460464067?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-21_07_50_41-17395325955460464067 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:49.555Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.231Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.266Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.338Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.371Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.404Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.431Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.473Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.510Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.564Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.590Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.615Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.638Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.718Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.803Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.833Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.872Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.898Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.950Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:52.984Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:50:53.038Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:51:16.694Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:51:37.156Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:52:04.132Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T15:52:04.170Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-21_07_50_41-17395325955460464067 after 600 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 83c8257dc376417a9e790fedacccab53 and timestamp: 1642781017.5752041:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0121150441.1642781022.179049/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220121160342179952-7422'
 createTime: '2022-01-21T16:03:49.597137Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-21_08_03_48-12179411668815968254'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0121150441'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-21T16:03:49.597137Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-21_08_03_48-12179411668815968254]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-21_08_03_48-12179411668815968254
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-21_08_03_48-12179411668815968254?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-21_08_03_48-12179411668815968254 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:57.075Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.146Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.200Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.261Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.347Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.374Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.446Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.516Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.563Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.590Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.623Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.653Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.731Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.809Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.834Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.870Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.893Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:58.967Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.002Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.077Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.099Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.133Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.210Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.261Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.300Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:03:59.353Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:04:26.287Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:04:48.605Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:05:17.942Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-21T16:05:17.978Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-21_08_03_48-12179411668815968254 after 601 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 40e13875983f4f3cbee5522a92e7e098 and timestamp: 1642781809.328137:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 88
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-21_07_50_41-17395325955460464067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-21_08_03_48-12179411668815968254?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py">, line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py">, line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py">, line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py">, line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py">, line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py">, line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_8be41088-7db9-4ebb-8d1b-80f85821c622_read'
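
The same cleanup failure as in the build above. Besides the keyword-argument form shown earlier, constructing the request object explicitly also avoids the constructor error, and idempotent cleanup usually tolerates an already-deleted subscription; a sketch under the same google-cloud-pubsub>=2.0 assumption:

    from google.api_core.exceptions import NotFound
    from google.cloud import pubsub_v1
    from google.pubsub_v1 import DeleteSubscriptionRequest

    sub_client = pubsub_v1.SubscriberClient()
    sub_path = 'projects/example-project/subscriptions/example_read'  # hypothetical

    try:
        # A request object is accepted positionally, unlike a bare string.
        sub_client.delete_subscription(DeleteSubscriptionRequest(subscription=sub_path))
    except NotFound:
        pass  # subscription already gone; acceptable for test cleanup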

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 28s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w3dvz425xwcbs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #591

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/591/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16370 from [BEAM-13556] playground - color and

[noreply] Merge pull request #16531 from [BEAM-13567] [playground] Handle run code

[noreply] Merge pull request #16533 from [BEAM-13548] [Playground] Add example

[noreply] Merge pull request #16519 from [BEAM-13639] [Playground] Add

[noreply] Merge pull request #16518 from [BEAM-13619] [Playground] Add loading

[noreply] Merge pull request #16243 from

[noreply] [BEAM-13683] Make cross-language SQL example pipeline (#16567)

[noreply] [BEAM-13688] fixed type in BPG 4.5.3 window section (#16560)

[noreply] Remove obsolete commands from Inventory job. (#16564)

[noreply] Disable logging for memoization test. (#16556)

[noreply] Merge pull request #16472: [BEAM-13697] Add SchemaFieldNumber annotation

[noreply] Merge pull request #16373 from [BEAM-13515] [Playground] Hiding lines in


------------------------------------------
[...truncated 55.24 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644478 sha256=434edc3178119e0855902c1bc53a96591f3436c43f496a91e191ba7f4db33dcf
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.39 botocore-1.23.39 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.30 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/dataflow-worker.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642693821.206981/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220120155021208017-8272'
 createTime: '2022-01-20T15:50:29.043659Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-20_07_50_28-9919287957370941559'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0120150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-20T15:50:29.043659Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-20_07_50_28-9919287957370941559]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-20_07_50_28-9919287957370941559
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-20_07_50_28-9919287957370941559?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-20_07_50_28-9919287957370941559 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:42.158Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:44.721Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:44.760Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.470Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.532Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.570Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.616Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.667Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.713Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.770Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.804Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.855Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.889Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:45.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:46.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:46.153Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:46.191Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:46.477Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:46.780Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:47Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:47.038Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:50:47.115Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:51:11.265Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:51:33.900Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:51:59.905Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T15:52:00.070Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-20_07_50_28-9919287957370941559 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: e0de8f30c6324ef0970bfbf746bc892c and timestamp: 1642694616.4104111:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 95
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0120150512.1642694621.286284/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220120160341287226-6946'
 createTime: '2022-01-20T16:03:48.157453Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-20_08_03_47-10846025921520529544'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0120150512'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-20T16:03:48.157453Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-20_08_03_47-10846025921520529544]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-20_08_03_47-10846025921520529544
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-20_08_03_47-10846025921520529544?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-20_08_03_47-10846025921520529544 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:54.946Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:55.875Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:55.928Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:55.994Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.074Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.107Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.198Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.277Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.312Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.337Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.381Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.424Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.479Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.549Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.605Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.655Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.704Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.745Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.774Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.816Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.865Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
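
The fusion messages above trace the read-side test pipeline: a Pub/Sub read, a measuring map, windowing, a global count, and a write back to Pub/Sub. A hedged sketch of a streaming pipeline with that shape; the subscription/topic paths and window size are assumptions, and the measuring step is a stand-in for the test's metric DoFn:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
      (p
       | 'Read from pubsub' >> beam.io.ReadFromPubSub(
             subscription='projects/my-project/subscriptions/my-sub')
       | 'Measure time' >> beam.Map(lambda msg: msg)  # stand-in for the metric DoFn
       | 'Window' >> beam.WindowInto(beam.window.FixedWindows(60))
       | 'Count messages' >> beam.combiners.Count.Globally().without_defaults()
       | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
       | 'Write to Pubsub' >> beam.io.WriteToPubSub(
             topic='projects/my-project/topics/my-results'))
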
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.932Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.958Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:56.993Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:57.026Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:57.076Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:57.110Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:03:57.162Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:04:04.855Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
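
This quota message recurs on every run because the project already holds 100 Dataflow-created custom metric descriptors. A hedged sketch of the cleanup the message suggests, assuming the google-cloud-monitoring client is available; the filter string and any staleness policy before deleting are up to the operator:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Apply your own staleness criteria before deleting a descriptor.
        client.delete_metric_descriptor(name=descriptor.name)
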
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:04:40.715Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:05:08.392Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-20T16:05:08.425Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-20_08_03_47-10846025921520529544 after 604 seconds
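
The timeout above is the harness giving up its bounded wait on a streaming job, not the job failing. A minimal sketch of that pattern, assuming `pipeline` is an already-constructed Beam pipeline; wait_until_finish takes a duration in milliseconds, and cancelling afterwards is a common follow-up since streaming jobs otherwise keep running:

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()
    result.wait_until_finish(duration=600 * 1000)  # wait at most ~10 minutes
    if not PipelineState.is_terminal(result.state):
        result.cancel()
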
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6d35d20647bb41b0a03cb5540f6170bf and timestamp: 1642695680.1594524:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 396
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-20_07_50_28-9919287957370941559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-20_08_03_47-10846025921520529544?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_96c404f8-8013-4b44-b420-7cb5269a912a_read'
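
The TypeError is the google-cloud-pubsub 2.x migration surfacing: delete_subscription no longer accepts the subscription path positionally, so the bare string is interpreted as the request mapping. A minimal sketch of the failing call and the keyword form the installed 2.9.0 client expects; the subscription path is copied from the error above:

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = ('projects/apache-beam-testing/subscriptions/'
                'pubsub_io_performance_96c404f8-8013-4b44-b420-7cb5269a912a_read')

    # sub_client.delete_subscription(sub_name)   # 1.x positional style; raises
    #                                            # the TypeError seen above
    sub_client.delete_subscription(subscription=sub_name)
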

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
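
The three suggestions can be combined on a single rerun of the failing task, for example (flags as suggested above, task path taken from this build):

    ./gradlew :sdks:python:apache_beam:testing:load_tests:run \
        --stacktrace --info --warning-mode all
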

BUILD FAILED in 32m 1s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p7x5666goeixe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #590

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/590/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Bump beam container version.

[alexander.zhuravlev] [BEAM-13680] Fixed code_repository (added pipelineUuid to RunCodeResult

[Robert Bradshaw] Also bump FnAPI container.

[noreply] [BEAM-13616][BEAM-13645] Switch to vendored grpc 1.43.2 (#16543)

[noreply] [BEAM-13616][BEAM-13646] Upgrade vendored calcite to 1.28.0:0.2 (#16544)

[noreply] Merge pull request #16486 from [BEAM-13544][Playground] Add logs to

[noreply] [BEAM-13683] Correct SQL transform schema, fix expansion address

[noreply] Update walkthrough.md (#16512)

[noreply] [BEAM-11808][BEAM-9879] Support aggregate functions with two arguments

[noreply] Merge pull request #16506 from [BEAM-13652][Playground] Send examples'

[noreply] Merge pull request #16322 from [BEAM-13407] [Playground] Preload fonts

[noreply] [BEAM-13665] Make SpannerIO projectID optional again (#16547)

[noreply] [BEAM-13015] Add state caching capability to be used as hint for runners

[noreply] Merge pull request #16309: [BEAM-13503] Set a default value to

[noreply] [BEAM-13015] Provide caching statistics in the status client. (#16495)

[noreply] [BEAM-13611] Skip test_xlang_jdbc_write (#16554)


------------------------------------------
[...truncated 55.08 KB...]
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644396 sha256=e4b79c11643320320e3d40a590bc3f437bc523a16a17650b473a0f97a9347ec7
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam

> Task :runners:google-cloud-dataflow-java:****:compileJava

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.38 botocore-1.23.38 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.5.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.29 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :runners:google-cloud-dataflow-java:****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:****:classes
> Task :runners:google-cloud-dataflow-java:****:shadowJar

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
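
Those two auth lines amount to a process-wide socket timeout; a one-line sketch of the effect:

    import socket
    socket.setdefaulttimeout(60)          # what the auth module sets
    assert socket.getdefaulttimeout() == 60.0
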
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642607526.628597/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220119155206629566-4347'
 createTime: '2022-01-19T15:52:12.901751Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-19_07_52_11-16525610744213100057'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0119150501'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-19T15:52:12.901751Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-19_07_52_11-16525610744213100057]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-19_07_52_11-16525610744213100057
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-19_07_52_11-16525610744213100057?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-19_07_52_11-16525610744213100057 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:19.858Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.741Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.786Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.854Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.891Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.916Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.955Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:20.988Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.058Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.115Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.147Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.180Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.254Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.292Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.323Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
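
This second fusion trace is the write-side pipeline: a bounded synthetic source expanded through the SDF machinery, formatted into payload bytes, measured, and published. A hedged sketch of that shape; the element count and payload size are assumptions chosen to mirror the 2 GB input in the job name, and the real test uses a synthetic SDF source rather than Create:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
      (p
       | 'Create input' >> beam.Create(range(2048))
       | 'Format to pubsub message in bytes' >> beam.Map(
             lambda _: b'\0' * 1024 * 1024)  # 2048 x 1 MiB ~= 2 GiB total
       | 'Measure time' >> beam.Map(lambda payload: payload)  # metric stand-in
       | 'Write to Pubsub' >> beam.io.WriteToPubSub(
             topic='projects/my-project/topics/my-input'))
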
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.479Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.518Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.547Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.613Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.676Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.709Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:21.734Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:52:40.477Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:53:06.146Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:53:32.861Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T15:53:32.901Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-19_07_52_11-16525610744213100057 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 8ca86a9c98dd4b0fa06a202735a32112 and timestamp: 1642608315.6434023:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 98
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220117" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/dataflow-****.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0119150501.1642608321.179783/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220119160521180729-4864'
 createTime: '2022-01-19T16:05:27.259483Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-19_08_05_26-6841291188933287151'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0119150501'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-19T16:05:27.259483Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-19_08_05_26-6841291188933287151]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-19_08_05_26-6841291188933287151
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-19_08_05_26-6841291188933287151?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-19_08_05_26-6841291188933287151 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:35.974Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:36.672Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:36.698Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:36.790Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:36.881Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:36.923Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:36.993Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.058Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.128Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.154Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.197Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.229Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.256Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.330Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.362Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.394Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.428Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.470Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.519Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.557Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.632Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.740Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.785Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.818Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.851Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.919Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.949Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:37.983Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:05:49.818Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:06:17.784Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:06:43.889Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-19T16:06:43.916Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-19_08_05_26-6841291188933287151 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 704d8608ef1a4d79b0c4a8e8807d32a6 and timestamp: 1642609203.0576816:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 257
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-19_07_52_11-16525610744213100057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-19_08_05_26-6841291188933287151?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_adfa1cf5-b66b-46be-9443-28a3ad9b1c17_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 41s
92 actionable tasks: 59 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/62rfpedhooayg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #589

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/589/display/redirect?page=changes>

Changes:

[mmack] [BEAM-8806] Integration test for SqsIO

[mmack] [BEAM-13631] Add deterministic SQS message coder to fix reading from SQS

[aydar.zaynutdinov] [BEAM-13641][Playground]

[noreply] Remove jcenter repositories from gradle configuration. (#16532)

[noreply] [BEAM-13430] Remove jcenter which will no longer contain any updates.

[noreply] [BEAM-13616] Update com.google.cloud:libraries-bom to 24.2.0 (#16509)


------------------------------------------
[...truncated 55.22 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644276 sha256=df0ee03206c5da9f15ec1450c59f139f6bd31f9dd0bb1cb77fdd7d5ced86707d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.37 botocore-1.23.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.4.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.29 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/dataflow-****.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521021.466796/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220118155021467785-7535'
 createTime: '2022-01-18T15:50:28.125310Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-18_07_50_27-15444666766888493184'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0118150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-18T15:50:28.125310Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-18_07_50_27-15444666766888493184]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-18_07_50_27-15444666766888493184
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-18_07_50_27-15444666766888493184?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-18_07_50_27-15444666766888493184 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:39.169Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.393Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.433Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.502Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.530Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.558Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.593Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.636Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.674Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.717Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.754Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.787Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.877Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.917Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:40.951Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.069Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.118Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.144Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.167Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.227Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.264Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:50:41.308Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:51:02.186Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:51:26.908Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:51:52.200Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T15:51:52.237Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-18_07_50_27-15444666766888493184 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 90ec3423bf6e42caa1fcc4087fa55d8d and timestamp: 1642521801.0458632:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 96
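
The fused step names in the log above ("Create input", "Format to pubsub message in bytes", "Measure time", "Write to Pubsub") trace the shape of the write half of this test. A rough reconstruction in Beam Python, with the synthetic-source spec and topic path as assumptions rather than values taken from this log:

    # Hedged sketch of the write pipeline implied by the fusion messages.
    import apache_beam as beam
    from apache_beam.io import Read, WriteToPubSub
    from apache_beam.testing.synthetic_pipeline import SyntheticSource

    with beam.Pipeline() as p:
        (p
         | 'Create input' >> Read(SyntheticSource(
             {'numRecords': 1000, 'keySizeBytes': 1, 'valueSizeBytes': 100}))
         | 'Format to pubsub message in bytes' >> beam.Map(
             lambda kv: bytes(kv[1]))  # WriteToPubSub expects bytes payloads
         | 'Measure time' >> beam.Map(lambda x: x)  # timing DoFn in the real test
         | 'Write to Pubsub' >> WriteToPubSub(
             topic='projects/<project>/topics/<topic>'))  # placeholder path
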
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/dataflow-worker.jar in 4 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0118150516.1642521804.694515/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
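
The two warnings above mean --pubsub_namespace_prefix is not a registered pipeline option, so PipelineOptions drops it; here the test harness consumes that flag itself, so the warning is likely benign. For reference, a minimal sketch of how a custom flag is registered so it survives parsing (the class name is illustrative):

    # Hedged sketch: registering a custom flag with PipelineOptions.
    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            parser.add_argument(
                '--pubsub_namespace_prefix', default=None,
                help='Prefix used to namespace the test topic/subscription.')

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    prefix = options.view_as(PubsubPerfOptions).pubsub_namespace_prefix
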
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220118160324695454-9277'
 createTime: '2022-01-18T16:03:30.889131Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-18_08_03_30-18035094016925129683'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0118150516'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-18T16:03:30.889131Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-18_08_03_30-18035094016925129683]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-18_08_03_30-18035094016925129683
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-18_08_03_30-18035094016925129683?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-18_08_03_30-18035094016925129683 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:52.807Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.178Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.215Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.271Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.328Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.344Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.395Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.460Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.489Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.516Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.548Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.638Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.669Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.732Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.839Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.874Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.908Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.941Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
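
The fusion messages above outline the read half of the test: "Count messages" expands into the KeyWithVoid/CombinePerKey/UnKey steps of a global count. A rough reconstruction, with the window size and resource paths as assumptions:

    # Hedged sketch of the read pipeline implied by the fusion messages.
    import apache_beam as beam
    from apache_beam.io import ReadFromPubSub, WriteToPubSub
    from apache_beam.transforms import combiners
    from apache_beam.transforms.window import FixedWindows

    with beam.Pipeline() as p:
        (p
         | 'Read from pubsub' >> ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<sub>')  # placeholder
         | 'Map' >> beam.Map(len)  # stands in for the lambda at pubsub_io_perf_test.py:171
         | 'Measure time' >> beam.Map(lambda x: x)  # timing DoFn in the real test
         | 'Window' >> beam.WindowInto(FixedWindows(60))  # window size assumed
         | 'Count messages' >> combiners.Count.Globally().without_defaults()
         | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
         | 'Write to Pubsub' >> WriteToPubSub(
             topic='projects/<project>/topics/<topic>'))  # placeholder
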
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:55.978Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:56.029Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:56.062Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:56.097Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:56.150Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:56.185Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:03:56.208Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:04:19.774Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:04:42.437Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:05:05.371Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-18T16:05:05.411Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-18_08_03_30-18035094016925129683 after 600 seconds
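
The "Timing out on waiting" warning above is what the Dataflow runner emits when wait_until_finish is given a bounded duration and the streaming job is still running when it expires; the job itself keeps running, and the test reads its metrics afterwards. A minimal sketch of that pattern (duration is in milliseconds):

    # Hedged sketch of waiting on a streaming job with a timeout.
    result = pipeline.run()  # 'pipeline' is a beam.Pipeline built elsewhere
    result.wait_until_finish(duration=10 * 60 * 1000)  # ~600 s, then times out
    # On timeout the job is still RUNNING; the caller is responsible for
    # collecting metrics and cancelling it, e.g. result.cancel().
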
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 4f01c48a6dca4a99b6e2814e962355ce and timestamp: 1642522832.7137494:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 376
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_d9a55022-fb40-44bf-912c-de9d7cb9d9c9_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-18_07_50_27-15444666766888493184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-18_08_03_30-18035094016925129683?project=apache-beam-testing
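
This TypeError is an API-surface mismatch in the test's cleanup, not a pipeline failure: under google-cloud-pubsub >= 2.0 (2.9.0 per the dependency list installed in these runs), the generated client methods are keyword-only, so a positional subscription path is interpreted as the `request` proto and rejected by its constructor. A minimal sketch of the v2-style call, with the subscription path as a placeholder:

    # Hedged sketch: delete_subscription under google-cloud-pubsub >= 2.0.
    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = ('projects/apache-beam-testing/subscriptions/'
                     'pubsub_io_performance_<uuid>_read')  # placeholder path

    # v1-style positional call -- raises the TypeError seen above:
    #   sub_client.delete_subscription(read_sub_name)
    # v2-style keyword call:
    sub_client.delete_subscription(subscription=read_sub_name)
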

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 31m 13s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pborjotwwntii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #588

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/588/display/redirect?page=changes>

Changes:

[mmack] [BEAM-8806] Integration test for SqsIO using Localstack

[noreply] Merge pull request #16507: [BEAM-13137] Fixes ES utest size flakiness


------------------------------------------
[...truncated 55.22 KB...]
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644276 sha256=49317aae201275a0914e7f72b0cc72362fcf8006aec6c58730a7a581a1646103
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.37 botocore-1.23.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.17.2 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.4.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.29 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642434621.289435/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220117155021290360-6044'
 createTime: '2022-01-17T15:50:27.995129Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-17_07_50_27-9565452200299987983'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0117150457'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-17T15:50:27.995129Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-17_07_50_27-9565452200299987983]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-17_07_50_27-9565452200299987983
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-17_07_50_27-9565452200299987983?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-17_07_50_27-9565452200299987983 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:41.413Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.088Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.475Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.540Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.583Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.608Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.645Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.679Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.724Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.783Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.815Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.849Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.872Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:47.956Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.069Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.106Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.139Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.175Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.234Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.263Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:48.310Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:50:58.698Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:51:38.785Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:52:04.066Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T15:52:04.099Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-17_07_50_27-9565452200299987983 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6ae6ce2bcfb744caa4f990bd0df90919 and timestamp: 1642435410.4398456:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0117150457.1642435415.020823/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220117160335021748-6823'
 createTime: '2022-01-17T16:03:41.407808Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-17_08_03_40-8131888755316910479'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0117150457'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-17T16:03:41.407808Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-17_08_03_40-8131888755316910479]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-17_08_03_40-8131888755316910479
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-17_08_03_40-8131888755316910479?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-17_08_03_40-8131888755316910479 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:49.321Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.446Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.471Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.546Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.618Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.644Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.710Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.785Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.826Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.860Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:50.983Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.019Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.041Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.070Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.155Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.189Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.233Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.298Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.353Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.391Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.438Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.472Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.528Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.560Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:03:51.589Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:04:18.082Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:04:37.074Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:05:02.486Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-17T16:05:02.517Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-17_08_03_40-8131888755316910479 after 604 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: c8ab147a7b1c4a379e083635a7ac2717 and timestamp: 1642436216.3611407:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 105
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 223, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 154, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py>", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py>", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_220c6460-8b87-444f-a62f-2aa3b3d6da84_read'
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-17_07_50_27-9565452200299987983?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-17_08_03_40-8131888755316910479?project=apache-beam-testing
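
The same cleanup failure recurs here. Equivalently to the keyword form sketched after the earlier traceback, the v2 client also accepts an explicit request object or a plain dict; a minimal sketch, again with a placeholder subscription path:

    # Hedged sketch: the request-object form of the same v2 cleanup call.
    from google.cloud import pubsub_v1
    from google.pubsub_v1 import DeleteSubscriptionRequest

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = ('projects/apache-beam-testing/subscriptions/'
                     'pubsub_io_performance_<uuid>_read')  # placeholder path

    sub_client.delete_subscription(
        request=DeleteSubscriptionRequest(subscription=read_sub_name))
    # A plain dict is accepted as well:
    #   sub_client.delete_subscription(request={'subscription': read_sub_name})
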

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 27m 35s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://scans.gradle.com/s/xoujrczzcwuf2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #587

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/587/display/redirect>

Changes:


------------------------------------------
[...truncated 55.73 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644276 sha256=ecfc55268ec7c8c270cb853d592d867eb786317fdf9374256ed51878552134bd
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.37 botocore-1.23.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.16.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.4.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.29 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/dataflow-****.jar in 6 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642348236.096625/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220116155036097600-2435'
 createTime: '2022-01-16T15:50:44.076998Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-16_07_50_42-4533271019742692201'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0116150509'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-16T15:50:44.076998Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-16_07_50_42-4533271019742692201]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-16_07_50_42-4533271019742692201
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-16_07_50_42-4533271019742692201?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-16_07_50_42-4533271019742692201 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.162Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.763Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.794Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.849Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.884Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.910Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.949Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:51.979Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.013Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.033Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.062Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.104Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.141Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.186Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.231Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
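
The fusion messages above outline the write pipeline: a synthetic source expanded as a splittable DoFn, its output formatted into byte payloads, timed, and written to Pub/Sub. A minimal, self-contained sketch of that shape (the topic path and tiny element set are placeholders, and the "Measure time" step is omitted; the real test generates the 2 GB synthetic input its job name advertises):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         | 'Create input' >> beam.Create(range(3))
         | 'Format to pubsub message in bytes' >> beam.Map(lambda i: b'payload-%d' % i)
         | 'Write to Pubsub' >> beam.io.WriteToPubSub('projects/example/topics/example'))
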
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.308Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.353Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.373Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.402Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.447Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.473Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:50:52.504Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:51:12.129Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
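
Per the message above, the project has exhausted its quota of custom metric descriptors, so Dataflow stops creating new custom.googleapis.com/* metrics. Stale descriptors can be listed and deleted with the Cloud Monitoring client; a rough sketch, assuming google-cloud-monitoring v2 and that the matched descriptors really are unused:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting a descriptor removes its historical time series; confirm
        # the metric is unused before running this against a real project.
        client.delete_metric_descriptor(name=descriptor.name)
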
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:51:36.133Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:52:01.542Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T15:52:01.560Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-16_07_50_42-4533271019742692201 after 603 seconds
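
The "Timing out" warning above is the harness giving up its bounded wait, not a job failure: wait_until_finish(duration=...) returns once the timeout elapses, and a streaming job keeps running until it is cancelled. A minimal sketch of that pattern (a trivial pipeline stands in for the load test; duration is in milliseconds):

    import apache_beam as beam

    pipeline = beam.Pipeline()  # runner/options omitted; the test targets Dataflow
    _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(print)
    result = pipeline.run()
    # Returns when the job finishes or after ~600 s, whichever comes first.
    result.wait_until_finish(duration=600 * 1000)
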
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: d5f775f11d554a04b647bb99f6f079be and timestamp: 1642349026.3975055:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 97
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0116150509.1642349030.771955/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220116160350772852-8892'
 createTime: '2022-01-16T16:03:57.465778Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-16_08_03_56-9720108847155444465'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0116150509'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-16T16:03:57.465778Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-16_08_03_56-9720108847155444465]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-16_08_03_56-9720108847155444465
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-16_08_03_56-9720108847155444465?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-16_08_03_56-9720108847155444465 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:05.324Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.055Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.089Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.213Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.268Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.296Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.397Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.492Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.539Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.563Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.604Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.657Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.679Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.713Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.741Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.788Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.822Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.906Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.959Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:07.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.041Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.084Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.130Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.174Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.206Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.272Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.300Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:08.343Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:44.145Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:04:56.023Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:05:19.841Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-16T16:05:19.874Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-16_08_03_56-9720108847155444465 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 40fe400d7b624933a2633983f847b727 and timestamp: 1642349927.597631:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 258
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 40fe400d7b624933a2633983f847b727 and timestamp: 1642349927.597631:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 258
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-16_07_50_42-4533271019742692201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-16_08_03_56-9720108847155444465?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 154, in run
    self.cleanup()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_640ed350-0f3c-4c13-8762-ebb6e74fa350_read'
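
The failure is in test cleanup, not in the pipeline itself: the generated clients in google-cloud-pubsub 2.x no longer accept a positional subscription path, so passing the string straight into delete_subscription() ends up as an invalid DeleteSubscriptionRequest constructor input. A minimal sketch of the 2.x-compatible call (the client construction and subscription path below are stand-ins for the test's own):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    read_sub_name = 'projects/apache-beam-testing/subscriptions/example_read'

    # Keyword form accepted by google-cloud-pubsub >= 2.0:
    sub_client.delete_subscription(subscription=read_sub_name)
    # Equivalent request-object form:
    # sub_client.delete_subscription(request={'subscription': read_sub_name})
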

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 27s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://scans.gradle.com/s/qkxknoug6sqmc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_PubsubIOIT_Python_Streaming #586

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/586/display/redirect?page=changes>

Changes:

[daria.malkova] Split builder into several builder for each step of pipeline execution

[Valentyn Tymofieiev] Provide API to check whether a hint is known.

[Valentyn Tymofieiev] [BEAM-12558] Fix doc typo.

[arietis27] [BEAM-13400] JDBC IO does not support UUID and JSONB PostgreSQL types

[jrmccluskey] [BEAM-10206] Resolve go vet errors in protox package

[noreply] Merge pull request #16482 from [BEAM-13429][Playground] Add builder for

[noreply] [BEAM-13590] Fix  abc imports from collections (#15850)

[jrmccluskey] Fix staticcheck errors in transforms directory

[jrmccluskey] Remove unnecessary fmt.Sprintf() in partition.go

[jrmccluskey] Replace bytes.Compare() with bytes.Equal() in test cases

[jrmccluskey] Replace string(buf.Bytes()) with buf.String() in coder_test.go

[jrmccluskey] Remove unnecessary blank identifier assignment in harness.go

[jrmccluskey] fix capitalized error strings in expansionx

[jrmccluskey] Clean up string cast of bytes in vet.go and corresponding tests

[jrmccluskey] Remove unnecessary fmt call in universal.go

[Robert Bradshaw] Remove tab from source.

[noreply] Redirecting cross-language transforms content (#16504)

[noreply] doc tweaks (#16498)

[noreply] [BEAM-12621] - Update Jenkins VMs to modern Ubuntu version (#16457)

[noreply] [BEAM-13664] Fix Primitives hashing benchmark (#16523)


------------------------------------------
[...truncated 55.73 KB...]
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.10-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 kB)
Collecting docker
  Using cached docker-5.0.3-py2.py3-none-any.whl (146 kB)
Collecting wrapt
  Using cached wrapt-1.13.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (79 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.37.0.dev0) (3.7.0)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.3-py3-none-any.whl (53 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.37.0.dev0-py3-none-any.whl size=2644276 sha256=447f7ddf1028c51007cbbd6673a6ef8a54780bbee43e63e9b57f57b42faf0c1d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pyasn1-modules, idna, charset-normalizer, certifi, cachetools, typing-extensions, requests, pytz, mypy-extensions, googleapis-common-protos, google-auth, wcwidth, typing-inspect, pyyaml, python-dateutil, pycparser, pluggy, oauthlib, more-itertools, jmespath, google-api-core, attrs, atomicwrites, websocket-client, typing-utils, requests-oauthlib, pytest, proto-plus, numpy, libcst, isodate, httplib2, grpcio-gcp, grpc-google-iam-v1, google-crc32c, docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, pbr, overrides, orjson, oauth2client, msrest, hdfs, grpcio-status, greenlet, google-resumable-media, google-cloud-pubsub, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, cloudpickle, azure-core, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, mock, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsublite, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.6
    Uninstalling pyparsing-3.0.6:
      Successfully uninstalled pyparsing-3.0.6
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.37.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 azure-core-1.21.1 azure-storage-blob-12.9.0 boto3-1.20.37 botocore-1.23.37 cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.10 cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.9 fasteners-0.16.3 freezegun-1.1.0 google-api-core-1.31.5 google-apitools-0.5.31 google-auth-1.35.0 google-cloud-bigquery-2.32.0 google-cloud-bigquery-storage-2.11.0 google-cloud-bigtable-1.7.0 google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.4.0 google-cloud-language-1.3.0 google-cloud-pubsub-2.9.0 google-cloud-pubsublite-1.3.0 google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.1.0 googleapis-common-protos-1.54.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 grpcio-status-1.43.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 isodate-0.6.1 jmespath-0.10.0 libcst-0.4.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 mypy-extensions-0.4.3 numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.1.1 orjson-3.6.5 overrides-6.1.0 pandas-1.3.5 parameterized-0.7.5 pbr-5.8.0 pluggy-0.13.1 proto-plus-1.19.8 psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 requests-2.27.1 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.8 s3transfer-0.5.0 sqlalchemy-1.4.29 tenacity-5.1.5 testcontainers-3.4.2 typing-extensions-4.0.1 typing-inspect-0.7.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 websocket-client-1.2.3 wrapt-1.13.3

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642261833.728492/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220115155033729502-2796'
 createTime: '2022-01-15T15:50:40.119523Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-15_07_50_39-807239498551424528'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0115150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-15T15:50:40.119523Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-15_07_50_39-807239498551424528]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-15_07_50_39-807239498551424528
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-15_07_50_39-807239498551424528?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-15_07_50_39-807239498551424528 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:46.765Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:47.875Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:47.898Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:47.942Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:47.967Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:47.989Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.009Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.034Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.062Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.089Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.115Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.135Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.203Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.225Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.295Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.313Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.339Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.365Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.434Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.450Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:50:48.489Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:51:04.957Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:51:37.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:52:02.481Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T15:52:02.509Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-15_07_50_39-807239498551424528 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 6ea9e72995bd4d06b9d779b199899128 and timestamp: 1642262622.585378:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 99
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.37.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20211222" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/pickled_main_session in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/dataflow-worker.jar in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0115150459.1642262626.620154/pipeline.pb in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220115160346621076-5745'
 createTime: '2022-01-15T16:03:53.539490Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-15_08_03_52-16397362782363953184'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0115150459'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-15T16:03:53.539490Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2022-01-15_08_03_52-16397362782363953184]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2022-01-15_08_03_52-16397362782363953184
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-15_08_03_52-16397362782363953184?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-01-15_08_03_52-16397362782363953184 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:00.647Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:01.580Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:01.674Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:01.778Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:01.903Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:01.922Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:01.980Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.042Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.074Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.120Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.170Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.298Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.327Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.353Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.377Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.406Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.429Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.470Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.495Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.540Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.572Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.628Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.658Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:02.692Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:35.669Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:04:55.021Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:05:21.416Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-01-15T16:05:21.443Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2022-01-15_08_03_52-16397362782363953184 after 602 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1c36c85f66544161a276aba624fe3925 and timestamp: 1642263578.7398694:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 309
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: 1c36c85f66544161a276aba624fe3925 and timestamp: 1642263578.7398694:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_read_runtime Value: 309
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-15_07_50_39-807239498551424528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-15_08_03_52-16397362782363953184?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 154, in run
    self.cleanup()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/pubsub_v1/_gapic.py", line 43, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py", line 817, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py", line 498, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 'projects/apache-beam-testing/subscriptions/pubsub_io_performance_8ae91e61-fa57-461d-8eb3-7fa0a5fe3337_read'

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 18s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://scans.gradle.com/s/2pjrfnwpvagey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org