Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/12/06 20:01:31 UTC

Build failed in Jenkins: beam_LoadTests_Python_CoGBK_Dataflow_Streaming #109

See <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/109/display/redirect>

Changes:


------------------------------------------
[...truncated 37.30 KB...]
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting pycparser
  Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)
Collecting pydot<2,>=1.2.0
  Using cached pydot-1.4.1-py2.py3-none-any.whl (19 kB)
Collecting pyhamcrest!=1.10.0,<2.0.0,>=1.9
  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)
Requirement already satisfied: six>=1.6 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.27.0.dev0) (1.15.0)
Collecting pymongo<4.0.0,>=3.8.0
  Using cached pymongo-3.11.2-cp37-cp37m-manylinux2014_x86_64.whl (512 kB)
Collecting pyparsing>=2.1.4
  Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting pytest<5.0,>=4.4.0
  Using cached pytest-4.6.11-py2.py3-none-any.whl (231 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.27.0.dev0) (1.9.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.27.0.dev0) (3.1.1)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.27.0.dev0) (0.13.1)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Collecting attrs>=17.4.0
  Using cached attrs-20.3.0-py2.py3-none-any.whl (49 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.27.0.dev0) (3.4.0)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.6.0-py3-none-any.whl (45 kB)
Collecting packaging
  Using cached packaging-20.7-py2.py3-none-any.whl (35 kB)
Collecting pytest-timeout<2,>=1.3.3
  Using cached pytest_timeout-1.4.2-py2.py3-none-any.whl (10 kB)
Collecting pytest-xdist<2,>=1.29.0
  Using cached pytest_xdist-1.34.0-py2.py3-none-any.whl (36 kB)
Collecting execnet>=1.1
  Using cached execnet-1.7.1-py2.py3-none-any.whl (39 kB)
Collecting apipkg>=1.4
  Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.3.0-py2.py3-none-any.whl (4.7 kB)
Collecting python-dateutil<3,>=2.8.0
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2018.3
  Using cached pytz-2020.4-py2.py3-none-any.whl (509 kB)
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp37-cp37m-linux_x86_64.whl
Collecting requests<3.0.0,>=2.24.0
  Using cached requests-2.25.0-py2.py3-none-any.whl (61 kB)
Collecting chardet<4,>=3.0.2
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna<3,>=2.5
  Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.8.0-py2.py3-none-any.whl (23 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.6-py3-none-any.whl (47 kB)
Collecting s3transfer<0.4.0,>=0.3.0
  Using cached s3transfer-0.3.3-py2.py3-none-any.whl (69 kB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.3.20-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting testcontainers<4.0.0,>=3.0.3
  Using cached testcontainers-3.1.0-py2.py3-none-any.whl
Collecting blindspin
  Using cached blindspin-2.0.1-cp37-none-any.whl
Collecting crayons
  Using cached crayons-0.4.0-py2.py3-none-any.whl (4.6 kB)
Collecting colorama
  Using cached colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker
  Using cached docker-4.4.0-py2.py3-none-any.whl (146 kB)
Collecting typing-extensions<3.8.0,>=3.7.0
  Using cached typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.6.0-py3-none-any.whl (8.1 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Collecting urllib3<1.27,>=1.25.4
  Using cached urllib3-1.26.2-py2.py3-none-any.whl (136 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-0.57.0-py2.py3-none-any.whl (200 kB)
Collecting wrapt
  Using cached wrapt-1.12.1-cp37-cp37m-linux_x86_64.whl
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.27.0.dev0-py3-none-any.whl size=2375617 sha256=e111b83a6c4bee54574dbf94e67063bd756ac410941c3e038285a546be6309db
  Stored in directory: /home/jenkins/.cache/pip/wheels/e9/b7/24/91d8a1ca1043324f0be07987d2ad0a06e5e9f339b09a468f5b
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyparsing, pycparser, pyasn1-modules, idna, chardet, certifi, cachetools, wcwidth, typing-extensions, requests, pytz, python-dateutil, packaging, oauthlib, mypy-extensions, more-itertools, jmespath, googleapis-common-protos, google-auth, cffi, attrs, atomicwrites, websocket-client, typing-inspect, requests-oauthlib, pyyaml, pytest, pbr, numpy, monotonic, isodate, httplib2, grpcio-gcp, google-crc32c, google-api-core, docopt, colorama, botocore, apipkg, wrapt, s3transfer, pytest-forked, pymongo, pydot, pyarrow, proto-plus, oauth2client, nose, msrest, mock, libcst, hdfs, grpc-google-iam-v1, google-resumable-media, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, crayons, blindspin, azure-core, avro-python3, testcontainers, tenacity, sqlalchemy, requests-mock, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, nose-xunitmp, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-build, google-cloud-bigtable, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
Successfully installed apache-beam-2.27.0.dev0 apipkg-1.5 atomicwrites-1.4.0 attrs-20.3.0 avro-python3-1.9.2.1 azure-core-1.9.0 azure-storage-blob-12.6.0 blindspin-2.0.1 boto3-1.16.30 botocore-1.19.30 cachetools-4.1.1 certifi-2020.12.5 cffi-1.14.4 chardet-3.0.4 colorama-0.4.4 crayons-0.4.0 crcmod-1.7 cryptography-3.2.1 deprecation-2.1.0 dill-0.3.1.1 docker-4.4.0 docopt-0.6.2 execnet-1.7.1 fastavro-1.2.1 fasteners-0.15 freezegun-1.0.0 google-api-core-1.23.0 google-apitools-0.5.31 google-auth-1.23.0 google-cloud-bigquery-1.28.0 google-cloud-bigtable-1.6.1 google-cloud-build-2.0.0 google-cloud-core-1.4.4 google-cloud-datastore-1.15.3 google-cloud-dlp-1.0.0 google-cloud-language-1.3.0 google-cloud-pubsub-1.7.0 google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 google-crc32c-1.0.0 google-resumable-media-1.1.0 googleapis-common-protos-1.52.0 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 hdfs-2.5.8 httplib2-0.17.4 idna-2.10 isodate-0.6.0 jmespath-0.10.0 libcst-0.3.15 mock-2.0.0 monotonic-1.5 more-itertools-8.6.0 msrest-0.6.19 mypy-extensions-0.4.3 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.19.4 oauth2client-4.1.3 oauthlib-3.1.0 packaging-20.7 pandas-1.1.4 parameterized-0.7.4 pbr-5.5.1 proto-plus-1.11.0 psycopg2-binary-2.8.6 pyarrow-2.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.20 pydot-1.4.1 pyhamcrest-1.10.1 pymongo-3.11.2 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.3.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.1 pytz-2020.4 pyyaml-5.3.1 requests-2.25.0 requests-mock-1.8.0 requests-oauthlib-1.3.0 rsa-4.6 s3transfer-0.3.3 sqlalchemy-1.3.20 tenacity-5.1.5 testcontainers-3.1.0 typing-extensions-3.7.4.3 typing-inspect-0.6.0 urllib3-1.26.2 wcwidth-0.2.5 websocket-client-0.57.0 wrapt-1.12.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.7_sdk:2.27.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-cogbk-1-1206150317.1607272765.263620/pipeline.pb...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-cogbk-1-1206150317.1607272765.263620/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-cogbk-1-1206150317.1607272765.263620/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-cogbk-1-1206150317.1607272765.263620/dataflow_python_sdk.tar in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--co_input_options={"num_records": 2000000,"key_size": 10,"value_size": 90,"num_hot_keys": 1000,"hot_key_fraction": 1}', '--iterations=1']
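The "Discarding unparseable args" warning above is expected for Beam load tests: PipelineOptions discards flags it does not recognize, and the test harness consumes them separately. A minimal argparse sketch of that separate parsing (the flag names and JSON payload come from the log line; the parsing code itself is a hypothetical illustration, not Beam's implementation):

```python
import argparse
import json

parser = argparse.ArgumentParser()
# Custom load-test flags; PipelineOptions does not know them, hence the
# "Discarding unparseable args" warning in the log.
parser.add_argument('--co_input_options', type=json.loads)
parser.add_argument('--iterations', type=int, default=1)

args, _unknown = parser.parse_known_args([
    '--co_input_options={"num_records": 2000000,"key_size": 10,'
    '"value_size": 90,"num_hot_keys": 1000,"hot_key_fraction": 1}',
    '--iterations=1',
])
# args.co_input_options is now a dict, e.g. args.co_input_options['num_records'] == 2000000
```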
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-12-06T16:39:27.640421Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-12-06_08_39_25-414800489272855928'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-cogbk-1-1206150317'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-12-06T16:39:27.640421Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-12-06_08_39_25-414800489272855928]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-12-06_08_39_25-414800489272855928
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-06_08_39_25-414800489272855928?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-06_08_39_25-414800489272855928 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:25.897Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled. Use --experiments=disable_runner_v2 to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:25.897Z: JOB_MESSAGE_BASIC: Streaming Engine auto-enabled. Use --experiments=disable_streaming_engine to opt out.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:30.160Z: JOB_MESSAGE_BASIC: Worker configuration: n1-highmem-4 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:30.907Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:30.934Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:30.981Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.005Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step CoGroupByKey /GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.031Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Read pc1/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.054Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Read pc2/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.075Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.093Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.185Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.279Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.317Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.338Z: JOB_MESSAGE_DETAILED: Unzipping flatten s21 for input s19.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.363Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of CoGroupByKey /GroupByKey/WriteStream, through flatten CoGroupByKey /Flatten, into producer CoGroupByKey /pair_with_pc1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.421Z: JOB_MESSAGE_DETAILED: Fusing consumer CoGroupByKey /GroupByKey/WriteStream into CoGroupByKey /pair_with_pc2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.445Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Split into Read pc1/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.476Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Split into Read pc2/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.500Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Reshuffle/AddRandomKeys into Read pc2/Split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.528Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Read pc2/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.552Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Read pc2/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.577Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Read pc2/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.602Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Read pc2/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.624Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/Reshuffle/RemoveRandomKeys into Read pc2/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.645Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc2/ReadSplits into Read pc2/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start pc2 into Read pc2/ReadSplits
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.695Z: JOB_MESSAGE_DETAILED: Fusing consumer CoGroupByKey /pair_with_pc2 into Measure time: Start pc2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.718Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Reshuffle/AddRandomKeys into Read pc1/Split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.734Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Read pc1/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.757Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Read pc1/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.784Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Read pc1/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Read pc1/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/Reshuffle/RemoveRandomKeys into Read pc1/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Read pc1/ReadSplits into Read pc1/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.879Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start pc1 into Read pc1/ReadSplits
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.896Z: JOB_MESSAGE_DETAILED: Fusing consumer CoGroupByKey /pair_with_pc1 into Measure time: Start pc1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.916Z: JOB_MESSAGE_DETAILED: Fusing consumer CoGroupByKey /GroupByKey/MergeBuckets into CoGroupByKey /GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.942Z: JOB_MESSAGE_DETAILED: Fusing consumer CoGroupByKey /Map(_merge_tagged_vals_under_key) into CoGroupByKey /GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.966Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume Joined Collections into CoGroupByKey /Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:31.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End into Consume Joined Collections
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:32.022Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:32.049Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:32.071Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:32.085Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-06_08_39_25-414800489272855928 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:33.218Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:33.236Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:39:33.260Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:40:04.568Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:40:04.596Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:40:05.106Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:40:14.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:40:35.076Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T16:40:35.105Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T18:39:21.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T20:00:29.493Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2020-12-06_08_39_25-414800489272855928.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T20:00:29.585Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T20:00:29.655Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T20:00:29.675Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T20:00:29.697Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-06_08_39_25-414800489272855928 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-06T20:00:29.722Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/co_group_by_key_test.py>", line 146, in <module>
    CoGroupByKeyTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 152, in run
    self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1675, in wait_until_finish
    'Job did not reach to a terminal state after waiting indefinitely.')
AssertionError: Job did not reach to a terminal state after waiting indefinitely.
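The AssertionError above comes from `wait_until_finish` in `dataflow_runner.py`: the load test waited up to its configured `timeout_ms`, the job never left the non-terminal JOB_STATE_CANCELLING, and the assertion fired. A simplified, hypothetical sketch of that polling loop (state names are taken from the log; the function, its signature, and the state-polling callback are illustrative stand-ins, not Beam's real API):

```python
import time

# Simplified stand-in for DataflowPipelineResult.wait_until_finish, not
# Beam's actual implementation.
TERMINAL_STATES = {
    'JOB_STATE_DONE', 'JOB_STATE_FAILED', 'JOB_STATE_CANCELLED',
    'JOB_STATE_UPDATED', 'JOB_STATE_DRAINED',
}

def wait_until_finish(poll_state, duration_ms=None, poll_interval_s=0.0):
    """Poll job state until it is terminal; raise if duration_ms expires first."""
    deadline = (None if duration_ms is None
                else time.monotonic() + duration_ms / 1000.0)
    state = poll_state()
    while state not in TERMINAL_STATES:
        if deadline is not None and time.monotonic() >= deadline:
            # The load test hit this branch: the cancel request left the job
            # stuck in JOB_STATE_CANCELLING when the timeout expired.
            raise AssertionError(
                'Job did not reach to a terminal state after waiting indefinitely.')
        time.sleep(poll_interval_s)
        state = poll_state()
    return state
```

In this run the job went PENDING, then RUNNING, then CANCELLING after the 20:00 cancel request, but no terminal state arrived before the timeout, so the call at `load_test.py` line 152 raised.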

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 23m 11s
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/756bqwjzo5xz2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_CoGBK_Dataflow_Streaming #110

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_CoGBK_Dataflow_Streaming/110/display/redirect?page=changes>

