Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/05 21:53:41 UTC

Build failed in Jenkins: beam_python_mongoio_load_test #2

See <https://builds.apache.org/job/beam_python_mongoio_load_test/2/display/redirect>

Changes:


------------------------------------------
[...truncated 37.68 KB...]
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp35-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached https://files.pythonhosted.org/packages/ac/7d/e63a1ba78326e42a69bda88b1fcfca22ddd773c4cc51ae85b3b869abcff2/fastavro-0.21.24-cp35-cp35m-manylinux1_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/8b/99/a0/81daf51dcd359a9377b110a8a886b3895921802d2fc1b2397e/future-0.18.2-cp35-none-any.whl
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.19.0.dev0) (1.25.0)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp35-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp35-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting numpy<2,>=1.14.3
  Using cached https://files.pythonhosted.org/packages/ab/e9/2561dbfbc05146bffa02167e09b9902e273decb2dc4cd5c43314ede20312/numpy-1.17.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting pymongo<4.0.0,>=3.8.0
  Using cached https://files.pythonhosted.org/packages/fe/96/3f43c48b2801e5cefe893421d67640cdc2b7cd940a51790b5c2062fb044e/pymongo-3.9.0-cp35-cp35m-manylinux1_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp35-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from apache-beam==2.19.0.dev0) (3.11.1)
Collecting pydot<2,>=1.2.0
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0
  Using cached https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl
Collecting pytz>=2018.3
  Using cached https://files.pythonhosted.org/packages/e7/f9/f0b53f88060247251bf481fa6ea62cd0d25bf1b11a87888e53ce5b7c8ad2/pytz-2019.3-py2.py3-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached https://files.pythonhosted.org/packages/dc/63/96fa7d9fb23eb07fd9f0eae2fb91b74193a536596eaf2712ca18aaa8f447/pyarrow-0.15.1-cp35-cp35m-manylinux2010_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/94/54/6f/a5df680fd3224aa45145686f3b1b02a878a90ea769fcf9daaf/avro_python3-1.9.1-cp35-none-any.whl
Collecting cachetools<4,>=3.1.0
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/d6/c2/92/837e8a4d649a209dff85b38d7fbb576b4b480738be70865f29/google_apitools-0.5.28-cp35-none-any.whl
Collecting google-cloud-datastore<1.8.0,>=1.7.1
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<1.1.0,>=0.39.0
  Using cached https://files.pythonhosted.org/packages/d3/91/07a82945a7396ea34debafd476724bb5fc267c292790fdf2138c693f95c5/google_cloud_pubsub-1.0.2-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0
  Using cached https://files.pythonhosted.org/packages/a4/96/1b9cf1d43869c47a205aad411dac7c3040df6093d63c39273fa4d4c45da7/google_cloud_bigquery-1.17.1-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<1.1.0,>=0.31.1
  Using cached https://files.pythonhosted.org/packages/95/af/0ef7d097a1d5ad0c843867600e86de915e8ab8864740f49a4636cfb51af6/google_cloud_bigtable-1.0.0-py2.py3-none-any.whl
Collecting nose>=1.3.7
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/c4/1f/cd/9250fbf2fcc179e28bb4f7ee26a4fc7525914469d83a4f0c09/nose_xunitmp-0.4.1-cp35-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached https://files.pythonhosted.org/packages/74/24/0cdbf8907e1e3bc5a8da03345c23cbed7044330bb8f73bb12e711a640a00/pandas-0.24.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/54/b7/c7/2ada654ee54483c9329871665aaf4a6056c3ce36f29cf66e67/PyYAML-5.2-cp35-cp35m-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
  Using cached https://files.pythonhosted.org/packages/8c/f1/66c54a412543b29454102ae74b1454fce2d307b1c36e6bd2e9818394df88/requests_mock-1.7.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached https://files.pythonhosted.org/packages/45/67/67bb1db087678bc5c6f20766cf18914dfe37b0b9d4e4c5bb87408460b75f/tenacity-5.1.5-py2.py3-none-any.whl
Collecting pytest<5.0,>=4.4.0
  Using cached https://files.pythonhosted.org/packages/64/f1/187a98b8f913a8f3a53d213cca2fae19718565f36165804d7f4f91fe5b76/pytest-4.6.6-py2.py3-none-any.whl
Collecting pytest-xdist<2,>=1.29.0
  Using cached https://files.pythonhosted.org/packages/f7/80/2af1fc039f779f61c7207dc9f79a1479874e7795f869fddaf135efde1cd4/pytest_xdist-1.30.0-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.19.0.dev0) (1.13.0)
Processing /home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting requests>=2.7.0
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11
  Using cached https://files.pythonhosted.org/packages/7a/db/a968fd7beb9fe06901c1841cb25c9ccb666ca1b9a19b114d1bbedf1126fc/pbr-5.4.4-py2.py3-none-any.whl
Collecting rsa>=3.1.4
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7
  Using cached https://files.pythonhosted.org/packages/62/1e/a94a8d635fa3ce4cfc7f506003548d0a2447ae76fd5ca53932970fe3053f/pyasn1-0.4.8-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5
  Using cached https://files.pythonhosted.org/packages/52/50/bb4cefca37da63a0c52218ba2cb1b1c36110d84dcbae8aa48cd67c5e95c2/pyasn1_modules-0.2.7-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.19.0.dev0) (42.0.2)
Collecting pyparsing>=2.1.4
  Using cached https://files.pythonhosted.org/packages/c0/0c/fc2e007d9a992d997f04a80125b0f183da7fb554f1de701bbb70a8e7d479/pyparsing-2.4.5-py2.py3-none-any.whl
Collecting fasteners>=0.14
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
  Using cached https://files.pythonhosted.org/packages/29/3a/c528ef37f48d6ffba16f0f3c0426456ba21e0dd32be9c61a2ade93e07faa/google_api_core-1.14.3-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/de/3a/83/77a1e18e1a8757186df834b86ce6800120ac9c79cd8ca4091b/grpc_google_iam_v1-0.12.3-cp35-none-any.whl
Collecting google-resumable-media<0.5.0dev,>=0.3.1
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting atomicwrites>=1.0
  Using cached https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.2.0)
Collecting wcwidth
  Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting attrs>=17.4.0
  Using cached https://files.pythonhosted.org/packages/a2/db/4313ab3be961f7a763066401fb77f7748373b6094076ae2bda2806988af6/attrs-19.3.0-py2.py3-none-any.whl
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.0)
Collecting pathlib2>=2.2.0; python_version < "3.6"
  Using cached https://files.pythonhosted.org/packages/e9/45/9c82d3666af4ef9f221cbb954e1d77ddbb513faf552aea6df5f37f1a4859/pathlib2-2.3.5-py2.py3-none-any.whl
Collecting packaging
  Using cached https://files.pythonhosted.org/packages/cf/94/9672c2d4b126e74c4496c6b3c58a8b51d6419267be9e70660ba23374c875/packaging-19.2-py2.py3-none-any.whl
Requirement already satisfied: more-itertools>=4.0.0; python_version > "2.7" in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (8.0.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Collecting pytest-forked
  Using cached https://files.pythonhosted.org/packages/03/1e/81235e1fcfed57a4e679d34794d60c01a1e9a29ef5b9844d797716111d80/pytest_forked-1.1.3-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached https://files.pythonhosted.org/packages/d3/2e/c63af07fa471e0a02d05793c7a56a9f7d274a8489442a5dc4fb3b2b3c705/execnet-1.7.1-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached https://files.pythonhosted.org/packages/b4/40/a9837291310ee1ccc242ceb6ebfd9eb21539649f193a7c8c86ba15b98539/urllib3-1.25.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached https://files.pythonhosted.org/packages/b9/63/df50cac98ea0d5b006c55a399c3bf1db9da7b5a24de7890bc9cfd5dd9e99/certifi-2019.11.28-py2.py3-none-any.whl
Collecting monotonic>=0.1
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0
  Using cached https://files.pythonhosted.org/packages/7b/cb/786dc53d93494784935a62947643b48250b84a882474e714f9af5e1a1928/google_auth-1.7.1-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/9e/3d/a2/1bec8bb7db80ab3216dbc33092bb7ccd0debfb8ba42b5668d5/googleapis_common_protos-1.6.0-cp35-none-any.whl
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Collecting apipkg>=1.4
  Using cached https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, docopt, idna, chardet, urllib3, certifi, requests, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, pyarrow, avro-python3, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, atomicwrites, wcwidth, attrs, pathlib2, packaging, pytest, pytest-forked, apipkg, execnet, pytest-xdist, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.11.28 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 fasteners-0.15 future-0.18.2 google-api-core-1.14.3 google-apitools-0.5.28 google-auth-1.7.1 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.4 oauth2client-3.0.0 packaging-19.2 pandas-0.24.2 parameterized-0.6.3 pathlib2-2.3.5 pbr-5.4.4 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.7 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.5 pytest-4.6.6 pytest-forked-1.1.3 pytest-xdist-1.30.0 python-dateutil-2.8.1 pytz-2019.3 pyyaml-5.2 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 urllib3-1.25.7 wcwidth-0.1.7

> Task :sdks:python:test-suites:dataflow:py35:mongodbioIT
--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:72: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size)
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1205214958-796432.1575582598.798017/pipeline.pb...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1205214958-796432.1575582598.798017/pipeline.pb in 6 seconds.
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1205214958-796432.1575582598.798017/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1205214958-796432.1575582598.798017/dataflow_python_sdk.tar in 0 seconds.
WARNING:apache_beam.utils.retry:Retry with exponential backoff: waiting for 4.661199472902707 seconds before retrying submit_job_description because we caught exception: BrokenPipeError: [Errno 32] Broken pipe
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 598, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 629, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py",> line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/httplib2/__init__.py",> line 1924, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/httplib2/__init__.py",> line 1595, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/httplib2/__init__.py",> line 1502, in _conn_request
    conn.request(method, request_uri, body, headers)
  File "/usr/lib/python3.5/http/client.py", line 1122, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.5/http/client.py", line 1167, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.5/http/client.py", line 1118, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.5/http/client.py", line 946, in _send_output
    self.send(message_body)
  File "/usr/lib/python3.5/http/client.py", line 918, in send
    self.sock.sendall(data)
  File "/usr/lib/python3.5/ssl.py", line 891, in sendall
    v = self.send(data[count:])
  File "/usr/lib/python3.5/ssl.py", line 861, in send
    return self._sslobj.write(data)
  File "/usr/lib/python3.5/ssl.py", line 586, in write
    return self._sslobj.write(data)

Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py",> line 100, in <module>
    run()
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py",> line 72, in run
    known_args.batch_size)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py",> line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py",> line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py",> line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 499, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 551, in create_job
    return self.submit_job_description(job)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py",> line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 598, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 629, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'connection': 'close', 'status': '413', 'referrer-policy': 'no-referrer', 'content-type': 'text/html; charset=UTF-8', 'content-length': '2393', 'date': 'Thu, 05 Dec 2019 21:53:19 GMT'}>, content <<!DOCTYPE html>
<html lang=en>
  <meta charset=utf-8>
  <meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">
  <title>Error 413 (Request Entity Too Large)!!1</title>
  <style>
    *{margin:0;padding:0}html,code{font:15px/22px arial,sans-serif}html{background:#fff;color:#222;padding:15px}body{margin:7% auto 0;max-width:390px;min-height:180px;padding:30px 0 15px}* > body{background:url(//www.google.com/images/errors/robot.png) 100% 5px no-repeat;padding-right:205px}p{margin:11px 0 22px;overflow:hidden}ins{color:#777;text-decoration:none}a img{border:0}@media screen and (max-width:772px){body{background:none;margin-top:0;max-width:none;padding-right:0}}#logo{background:url(//www.google.com/images/branding/googlelogo/1x/googlelogo_color_150x54dp.png) no-repeat;margin-left:-5px}@media only screen and (min-resolution:192dpi){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat 0% 0%/100% 100%;-moz-border-image:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) 0}}@media only screen and (-webkit-min-device-pixel-ratio:2){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat;-webkit-background-size:100% 100%}}#logo{display:inline-block;height:54px;width:150px}
  </style>
  <a href=//www.google.com/><span id=logo aria-label=Google></span></a>
  <p><b>413.</b> <ins>That&#8217;s an error.</ins>
  <p>Your client issued a request that was too large.
 <script>
  (function() { var c=function(a,d,b){a=a+"=deleted; path="+d;null!=b&&(a+="; domain="+b);document.cookie=a+"; expires=Thu, 01 Jan 1970 00:00:00 GMT"};var g=function(a){var d=e,b=location.hostname;c(d,a,null);c(d,a,b);for(var f=0;;){f=b.indexOf(".",f+1);if(0>f)break;c(d,a,b.substring(f+1))}};var h;if(4E3<unescape(encodeURI(document.cookie)).length){for(var k=document.cookie.split(";"),l=[],m=0;m<k.length;m++){var n=k[m].match(/^\s*([^=]+)/);n&&l.push(n[1])}for(var p=0;p<l.length;p++){var e=l[p];g("/");for(var q=location.pathname,r=0;;){r=q.indexOf("/",r+1);if(0>r)break;var t=q.substring(0,r);g(t);g(t+"/")}"/"!=q.charAt(q.length-1)&&(g(q),g(q+"/"))}h=!0}else h=!1;
h&&setTimeout(function(){if(history.replaceState){var a=location.href;history.replaceState(null,"","/");location.replace(a)}},1E3); })();

</script>
 <ins>That&#8217;s all we know.</ins>
>

> Task :sdks:python:test-suites:dataflow:py35:mongodbioIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 115

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:mongodbioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 8s
5 actionable tasks: 5 executed

Publishing build scan...
https://scans.gradle.com/s/7n4jpiliwlpfq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_python_mongoio_load_test #5

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_python_mongoio_load_test/5/display/redirect>



Build failed in Jenkins: beam_python_mongoio_load_test #4

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_python_mongoio_load_test/4/display/redirect>

Changes:


------------------------------------------
[...truncated 45.25 KB...]
Collecting fasteners>=0.14
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
  Using cached https://files.pythonhosted.org/packages/29/3a/c528ef37f48d6ffba16f0f3c0426456ba21e0dd32be9c61a2ade93e07faa/google_api_core-1.14.3-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/de/3a/83/77a1e18e1a8757186df834b86ce6800120ac9c79cd8ca4091b/grpc_google_iam_v1-0.12.3-cp35-none-any.whl
Collecting google-resumable-media<0.5.0dev,>=0.3.1
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting wcwidth
  Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting pathlib2>=2.2.0; python_version < "3.6"
  Using cached https://files.pythonhosted.org/packages/e9/45/9c82d3666af4ef9f221cbb954e1d77ddbb513faf552aea6df5f37f1a4859/pathlib2-2.3.5-py2.py3-none-any.whl
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.0)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.2.0)
Collecting atomicwrites>=1.0
  Using cached https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Collecting packaging
  Using cached https://files.pythonhosted.org/packages/cf/94/9672c2d4b126e74c4496c6b3c58a8b51d6419267be9e70660ba23374c875/packaging-19.2-py2.py3-none-any.whl
Requirement already satisfied: more-itertools>=4.0.0; python_version > "2.7" in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (8.0.2)
Collecting attrs>=17.4.0
  Using cached https://files.pythonhosted.org/packages/a2/db/4313ab3be961f7a763066401fb77f7748373b6094076ae2bda2806988af6/attrs-19.3.0-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached https://files.pythonhosted.org/packages/d3/2e/c63af07fa471e0a02d05793c7a56a9f7d274a8489442a5dc4fb3b2b3c705/execnet-1.7.1-py2.py3-none-any.whl
Collecting pytest-forked
  Using cached https://files.pythonhosted.org/packages/03/1e/81235e1fcfed57a4e679d34794d60c01a1e9a29ef5b9844d797716111d80/pytest_forked-1.1.3-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached https://files.pythonhosted.org/packages/b4/40/a9837291310ee1ccc242ceb6ebfd9eb21539649f193a7c8c86ba15b98539/urllib3-1.25.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached https://files.pythonhosted.org/packages/b9/63/df50cac98ea0d5b006c55a399c3bf1db9da7b5a24de7890bc9cfd5dd9e99/certifi-2019.11.28-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting monotonic>=0.1
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/9e/3d/a2/1bec8bb7db80ab3216dbc33092bb7ccd0debfb8ba42b5668d5/googleapis_common_protos-1.6.0-cp35-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0
  Using cached https://files.pythonhosted.org/packages/ec/11/1d90cbfa72a084b08498e8cea1fee199bc965cdac391d241f5ae6257073e/google_auth-1.7.2-py2.py3-none-any.whl
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Collecting apipkg>=1.4
  Using cached https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, urllib3, certifi, chardet, idna, requests, docopt, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro-python3, pyarrow, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, wcwidth, pathlib2, atomicwrites, packaging, attrs, pytest, apipkg, execnet, pytest-forked, pytest-xdist, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.11.28 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 fasteners-0.15 future-0.18.2 google-api-core-1.14.3 google-apitools-0.5.28 google-auth-1.7.2 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.1.0 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.4 oauth2client-3.0.0 packaging-19.2 pandas-0.24.2 parameterized-0.6.3 pathlib2-2.3.5 pbr-5.4.4 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.7 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.5 pytest-4.6.7 pytest-forked-1.1.3 pytest-xdist-1.30.0 python-dateutil-2.8.1 pytz-2019.3 pyyaml-5.2 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 urllib3-1.25.7 wcwidth-0.1.7

> Task :sdks:python:test-suites:dataflow:py35:mongodbioIT
--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:79: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size))
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/pipeline.pb...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2019-12-06T21:20:40.467265Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-06_13_20_38-4196865902256457411'
 location: 'us-central1'
 name: 'beamapp-jenkins-1206212037-804024'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-06T21:20:40.467265Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2019-12-06_13_20_38-4196865902256457411]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_20_38-4196865902256457411?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2019-12-06_13_20_38-4196865902256457411 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:38.729Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-06_13_20_38-4196865902256457411. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:38.729Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-06_13_20_38-4196865902256457411.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:42.617Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.289Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.935Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.960Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.987Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.017Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.089Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.125Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.149Z: JOB_MESSAGE_DETAILED: Fusing consumer Create documents into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.173Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/ParDo(_GenerateObjectIdFn) into Create documents
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.197Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/AddRandomKeys into WriteToMongoDB/ParDo(_GenerateObjectIdFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.234Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into WriteToMongoDB/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.268Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify into WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.292Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write into WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.326Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.361Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.388Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/RemoveRandomKeys into WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.424Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/ParDo(_WriteMongoFn) into WriteToMongoDB/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.462Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.498Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.527Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.563Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.718Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.802Z: JOB_MESSAGE_BASIC: Executing operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.849Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.883Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.915Z: JOB_MESSAGE_BASIC: Finished operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:45.002Z: JOB_MESSAGE_DEBUG: Value "WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:45.076Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Create documents+WriteToMongoDB/ParDo(_GenerateObjectIdFn)+WriteToMongoDB/Reshuffle/AddRandomKeys+WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2019-12-06_13_20_38-4196865902256457411 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:21:10.627Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:21:13.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:22:41.112Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:22:41.143Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:24:19.896Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 1 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:24:24.928Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 5 to 1.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:26:44.686Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:32:44.686Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.283Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+Create documents+WriteToMongoDB/ParDo(_GenerateObjectIdFn)+WriteToMongoDB/Reshuffle/AddRandomKeys+WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.418Z: JOB_MESSAGE_BASIC: Executing operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.469Z: JOB_MESSAGE_BASIC: Finished operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.553Z: JOB_MESSAGE_BASIC: Executing operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+WriteToMongoDB/Reshuffle/RemoveRandomKeys+WriteToMongoDB/ParDo(_WriteMongoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:20.764Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 1 to 6.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:26.771Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:26.803Z: JOB_MESSAGE_DETAILED: Resized worker pool to 5, though goal was 6.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:32.222Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 6 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:38:44.689Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:42:51.688Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 6 to 7.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:42:57.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 7 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:43:56.052Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 7 to 8.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:44:07.266Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:44:44.688Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:44:55.967Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 8 to 9.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:45:01.698Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 9 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:45:54.070Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 9 to 10.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:45:59.919Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:24.452Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 10 to 14.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:30.323Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 13 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:30.358Z: JOB_MESSAGE_DETAILED: Resized worker pool to 13, though goal was 14.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:35.795Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 14 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:49:22.785Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 14 to 16.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:49:28.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 16 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:50:44.689Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:51:55.176Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 16 to 21.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:52:01.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 21 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:53:51.083Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 21 to 29.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:53:57.184Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 28 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:53:57.217Z: JOB_MESSAGE_DETAILED: Resized worker pool to 28, though goal was 29.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:54:02.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 29 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:55:53.913Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 29 to 33.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:55:59.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 32 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:55:59.803Z: JOB_MESSAGE_DETAILED: Resized worker pool to 32, though goal was 33.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:56:05.218Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 33 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:56:44.688Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:57:23.523Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 33 to 42.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:57:29.337Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 42 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:58:54.318Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 42 to 58.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:59:00.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 58 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:00:25.086Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 58 to 66.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:00:30.963Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 66 based on the rate of progress in the currently running step(s).
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 2/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:02:44.691Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.204Z: JOB_MESSAGE_BASIC: Finished operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+WriteToMongoDB/Reshuffle/RemoveRandomKeys+WriteToMongoDB/ParDo(_WriteMongoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.323Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.447Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.501Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.532Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:07:39.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 66 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:07:39.796Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:07:39.865Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2019-12-06_13_20_38-4196865902256457411 is in state JOB_STATE_DONE
INFO:__main__:Writing 10000000 documents to mongodb finished in 2835.091 seconds
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1575667237
<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:94: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/pipeline.pb in 2 seconds.
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/dataflow_python_sdk.tar in 0 seconds.
Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>", line 107, in <module>
    run()
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>", line 96, in run
    r, equal_to([number for number in range(known_args.num_documents)]))
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py>", line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 112, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 513, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 551, in create_job
    return self.submit_job_description(job)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 598, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 629, in Create
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpBadRequestError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'content-length': '229', 'x-frame-options': 'SAMEORIGIN', '-content-encoding': 'gzip', 'x-xss-protection': '0', 'vary': 'Origin, X-Origin, Referer', 'cache-control': 'private', 'server': 'ESF', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 06 Dec 2019 22:19:59 GMT', 'status': '400'}>, content <{
  "error": {
    "code": 400,
    "message": "(e17212cdce498236): The job graph is too large. Please try again with a smaller job graph, or split your job into two or more smaller jobs.",
    "status": "INVALID_ARGUMENT"
  }
}
>
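A note on the 400 above: the traceback shows the verification step building `equal_to([number for number in range(known_args.num_documents)])`, so with 10000000 documents the entire expected list is serialized into the submitted pipeline, which plausibly pushes the job past Dataflow's job-graph size limit. A hedged sketch of one workaround — this is an illustration, not necessarily the fix the Beam developers adopted, and `expected_aggregates` is a hypothetical helper — is to assert on aggregates with closed-form expected values instead of the full list:

```python
# Sketch: validate a read of documents numbered 0..n-1 by aggregates,
# rather than embedding the full expected list in the pipeline graph.

def expected_aggregates(n):
    """Closed-form element count and sum for values 0..n-1."""
    return n, n * (n - 1) // 2

# In the Beam pipeline, the two scalars would come from
# beam.combiners.Count.Globally() and beam.CombineGlobally(sum),
# and assert_that would compare against these constants instead of
# equal_to(range(n)), keeping the serialized job graph small.
count, total = expected_aggregates(10_000_000)
print(count, total)  # 10000000 49999995000000
```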

> Task :sdks:python:test-suites:dataflow:py35:mongodbioIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 115

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:mongodbioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 14s
5 actionable tasks: 5 executed

Publishing build scan...
https://scans.gradle.com/s/iclrx53y73ayq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_python_mongoio_load_test #3

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_python_mongoio_load_test/3/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2ec3b0495c191597c9a88830d25a2c360b3277e0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2ec3b0495c191597c9a88830d25a2c360b3277e0
Commit message: "Merge pull request #10299 from [BEAM-8842] Reactivating BQIO Py test while preventing timeouts."
First time build. Skipping changelog.
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_python_mongoio_load_test] $ /bin/bash -xe /tmp/jenkins3572769796167941918.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBECONFIG="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>"

[EnvInject] - Variables injected successfully.
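The two build steps above copy the agent's shared kubeconfig into the workspace and then export KUBECONFIG pointing at the copy, so each job run reaches the cluster through its own credentials file. A minimal sketch of the same pattern, with stand-in paths rather than the real Jenkins agent paths:

```shell
# Per-job kubeconfig isolation, mirroring the build steps above.
# All paths below are illustrative stand-ins for the Jenkins agent's real ones.
WORKSPACE=$(mktemp -d)                             # stand-in for $WORKSPACE
mkdir -p "$WORKSPACE/.kube"
echo "apiVersion: v1" > "$WORKSPACE/.kube/config"  # stand-in for ~/.kube/config

# Copy the shared config to a job-specific file and point kubectl at it.
JOB_KUBECONFIG="$WORKSPACE/config-beam-python-mongoio-load-test-3"
cp "$WORKSPACE/.kube/config" "$JOB_KUBECONFIG"
export KUBECONFIG="$JOB_KUBECONFIG"
```

Any kubectl invocation in later build steps then reads the job-local copy via the exported KUBECONFIG, instead of sharing mutable state with parallel jobs.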
[beam_python_mongoio_load_test] $ /bin/bash -xe /tmp/jenkins1165162051713657137.sh
+ <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/kubernetes.sh> createNamespace beam-python-mongoio-load-test-3
+ KUBECONFIG='"<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>"'
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=default'
+ createNamespace beam-python-mongoio-load-test-3
+ eval 'kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" create namespace beam-python-mongoio-load-test-3'
++ kubectl --kubeconfig=<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3> create namespace beam-python-mongoio-load-test-3
namespace/beam-python-mongoio-load-test-3 created
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBERNETES_NAMESPACE=beam-python-mongoio-load-test-3

[EnvInject] - Variables injected successfully.
[beam_python_mongoio_load_test] $ /bin/bash -xe /tmp/jenkins6460128924165062403.sh
+ <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
+ KUBECONFIG='"<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>"'
+ KUBERNETES_NAMESPACE=beam-python-mongoio-load-test-3
+ KUBECTL='kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3'
+ apply <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
+ eval 'kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3 apply -R -f <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>'
++ kubectl --kubeconfig=<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3> --namespace=beam-python-mongoio-load-test-3 apply -R -f <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
service/mongo-load-balancer-service created
replicationcontroller/mongo created
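Judging from the xtrace above, kubernetes.sh assembles a single KUBECTL command string from the job's kubeconfig and namespace, and its helpers eval that string. A sketch reconstructed from the trace alone (the real script lives at .test-infra/kubernetes/kubernetes.sh and may differ in detail; paths here are illustrative):

```shell
# Helper sketch reconstructed from the xtrace above; paths are illustrative.
KUBECONFIG="${KUBECONFIG:-/tmp/config-beam-python-mongoio-load-test-3}"
KUBERNETES_NAMESPACE="${KUBERNETES_NAMESPACE:-default}"
KUBECTL="kubectl --kubeconfig=\"${KUBECONFIG}\" --namespace=${KUBERNETES_NAMESPACE}"

# createNamespace <name>: make a per-job namespace. No --namespace flag,
# matching the traced 'create namespace' call, which omits it.
createNamespace() {
  eval "kubectl --kubeconfig=\"${KUBECONFIG}\" create namespace $1"
}

# apply <file-or-dir>: recursively apply manifests, e.g. mongo.yml above.
apply() {
  eval "${KUBECTL} apply -R -f $1"
}
```

The eval-a-string style explains the doubled quoting visible in the trace: KUBECTL carries its own embedded quotes so the path survives the second round of word splitting.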
[beam_python_mongoio_load_test] $ /bin/bash -xe /tmp/jenkins116822935063830087.sh
+ set -eo pipefail
+ eval <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP mongo-load-balancer-service
++ <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP mongo-load-balancer-service
+ sed 's/^/LOAD_BALANCER_IP=/'
+ KUBECONFIG='"<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>"'
+ KUBERNETES_NAMESPACE=beam-python-mongoio-load-test-3
+ KUBECTL='kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3'
+ loadBalancerIP mongo-load-balancer-service
+ local name=mongo-load-balancer-service
+ local 'command=kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3> --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 1 == \3\6 ]]
+ sleep 10
[...attempts 2-4 omitted: identical to attempt 1 except the loop counter, each ending in '+ sleep 10'...]
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig="<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3>" --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/config-beam-python-mongoio-load-test-3> --namespace=beam-python-mongoio-load-test-3 get svc mongo-load-balancer-service '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=35.202.123.167
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 35.202.123.167 ]]
+ echo 35.202.123.167
+ return 0
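The trace above is kubernetes.sh's retry helper polling for the LoadBalancer's external IP; here the address (35.202.123.167) appeared on the fifth attempt, roughly forty seconds in. A self-contained sketch of the retry/loadBalancerIP pair as reconstructed from the trace (the kubeconfig and namespace flags are dropped for brevity; the real helpers include them):

```shell
# retry <command> <max_retries> <sleep_time>: re-run <command> until it
# exits 0 AND prints non-empty output, sleeping between attempts.
retry() {
  local command=$1 max_retries=$2 sleep_time=$3
  local output status i
  for (( i = 1; i <= max_retries; i++ )); do
    output=$(eval "$command")
    status=$?
    if [[ $status -eq 0 && -n $output ]]; then
      echo "$output"
      return 0
    fi
    if (( i < max_retries )); then
      sleep "$sleep_time"
    fi
  done
  echo "command failed after $max_retries attempts" >&2
  return 1
}

# loadBalancerIP <service>: poll until the external ingress IP is assigned.
loadBalancerIP() {
  retry "kubectl get svc $1 -ojsonpath='{.status.loadBalancer.ingress[0].ip}'" 36 10
}
```

The empty output= lines in the trace are attempts where the service existed but GKE had not yet assigned an ingress IP, which is why retry requires non-empty output in addition to a zero exit status.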
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g "-Popts=--temp_location=gs://temp-storage-for-perf-tests/loadtests --project=apache-beam-testing --mongo_uri=mongodb://35.202.123.167:27017 --num_documents=10000000 --batch_size=10000 --runner=DataflowRunner --num_workers=5" :sdks:python:test-suites:dataflow:py35:mongodbioIT
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

FAILURE: Build failed with an exception.

* What went wrong:
Task 'mongodbioIT' not found in project ':sdks:python:test-suites:dataflow:py35'.

* Try:
Run gradlew tasks to get a list of available tasks. Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23s

Publishing build scan...
https://scans.gradle.com/s/yerptt3s2e6cs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org