Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/16 13:08:42 UTC

Build failed in Jenkins: beam_LoadTests_Python_GBK_Flink_Batch #65

See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/65/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7389] Add code examples for KvSwap page

[dcavazos] [BEAM-7389] Add code examples for Map page

[dcavazos] [BEAM-7389] Add code examples for Keys page

[dcavazos] [BEAM-7389] Add code examples for WithTimestamps page

[iemejia] Update build plugins

[mxm] [BEAM-7936] Update portable WordCount Gradle task on portability page

[markliu] Fix command format in Release Guide

[github] Update stager.py

[kedin] [SQL] Add custom table name resolution

[kedin] [SQL] Support complex identifiers in DataCatalog

[github] Downgrade log message level

[lukecwik] [BEAM-7987] Drop empty Windmill workitem in WindowingWindmillReader

------------------------------------------
[...truncated 92.66 KB...]
Collecting configparser>=3.5; python_version < "3" (from importlib-metadata>=0.12->pluggy<1,>=0.3.0->tox==3.11.1)
  Using cached https://files.pythonhosted.org/packages/ab/1a/ec151e5e703ac80041eaccef923611bbcec2b667c20383655a06962732e9/configparser-3.8.1-py2.py3-none-any.whl
Collecting scandir; python_version < "3.5" (from pathlib2; python_version == "3.4.*" or python_version < "3"->importlib-metadata>=0.12->pluggy<1,>=0.3.0->tox==3.11.1)
Installing collected packages: six, contextlib2, zipp, scandir, pathlib2, configparser, importlib-metadata, pluggy, toml, virtualenv, py, filelock, tox, futures, enum34, grpcio, protobuf, grpcio-tools
Successfully installed configparser-3.8.1 contextlib2-0.5.5 enum34-1.1.6 filelock-3.0.12 futures-3.3.0 grpcio-1.23.0 grpcio-tools-1.3.5 importlib-metadata-0.19 pathlib2-2.3.4 pluggy-0.12.0 protobuf-3.9.1 py-1.8.0 scandir-1.10.0 six-1.12.0 toml-0.10.0 tox-3.11.1 virtualenv-16.7.3 zipp-0.5.2

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.16.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.16.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/e3/5956c75f68906b119191ef30d9acff661b422cf918a29a03ee0c3ba774be/fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl
Collecting future<1.0.0,>=0.16.0 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: grpcio<2,>=1.8 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.16.0.dev0) (1.23.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.16.0.dev0)
Collecting httplib2<=0.12.0,>=0.8 (from apache-beam==2.16.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting pymongo<4.0.0,>=3.8.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/00/5c/5379d5b8167a5938918d9ee147f865f6f8a64b93947d402cfdca5c1416d2/pymongo-3.9.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.16.0.dev0) (3.9.1)
Collecting pydot<2,>=1.2.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting pytz>=2018.3 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/87/76/46d697698a143e05f77bec5a526bf4e56a0be61d63425b68f4ba553b51f2/pytz-2019.2-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.16.0.dev0)
Collecting avro<2.0.0,>=1.8.1 (from apache-beam==2.16.0.dev0)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from apache-beam==2.16.0.dev0) (3.3.0)
Collecting pyvcf<0.7.0,>=0.6.8 (from apache-beam==2.16.0.dev0)
Collecting typing<3.7.0,>=3.6.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/cc/3e/29f92b7aeda5b078c86d14f550bf85cff809042e3429ace7af6193c3bc9f/typing-3.6.6-py2-none-any.whl
Collecting pyarrow<0.15.0,>=0.11.1 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/24/54/3c6225f1ca70351338075af3a3aa3119f2f6c8175989b62eb759cc4a9e5b/pyarrow-0.14.1-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting cachetools<4,>=3.1.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/2f/a6/30b0a0bef12283e83e58c1d6e7b5aabc7acfc4110df81a4471655d33e704/cachetools-3.1.1-py2.py3-none-any.whl
Collecting google-apitools<0.5.29,>=0.5.28 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/07/5e/3e04cb66f5ced9267a854184bb09863d85d199646ea8480fee26b4313a00/google_apitools-0.5.28-py2-none-any.whl
Collecting google-cloud-datastore<1.8.0,>=1.7.1 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d0/aa/29cbcf8cf7d08ce2d55b9dce858f7c632b434cb6451bed17cb4275804217/google_cloud_datastore-1.7.4-py2.py3-none-any.whl
Collecting google-cloud-pubsub<0.40.0,>=0.39.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c0/9a/4455b1c1450e9b912855b58ca6eee7a27ff1e9b52e4d98c243d93256f469/google_cloud_pubsub-0.39.1-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.18.0,>=1.6.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/d7/72/e88edd9a0b3c16a7b2c4107b1a9d3ff182b84a29f051ae15293e1375d7fe/google_cloud_bigquery-1.17.0-py2.py3-none-any.whl
Collecting google-cloud-core<2,>=0.28.1 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ee/f0/084f598629db8e6ec3627688723875cdb03637acb6d86999bb105a71df64/google_cloud_core-1.0.3-py2.py3-none-any.whl
Collecting google-cloud-bigtable<0.33.0,>=0.31.1 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/08/77/b468e209dbb0a6f614e6781f06a4894299a4c6167c2c525cc086caa7c075/google_cloud_bigtable-0.32.2-py2.py3-none-any.whl
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from apache-beam==2.16.0.dev0)
Collecting googledatastore<7.1,>=7.0.1 (from apache-beam==2.16.0.dev0)
Collecting nose>=1.3.7 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/99/4f/13fb671119e65c4dce97c60e67d3fd9e6f7f809f2b307e2611f4701205cb/nose-1.3.7-py2-none-any.whl
Collecting nose_xunitmp>=0.4.1 (from apache-beam==2.16.0.dev0)
Collecting numpy<2,>=1.14.3 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1f/c7/198496417c9c2f6226616cff7dedf2115a4f4d0276613bab842ec8ac1e23/numpy-1.16.4-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/b7/e3/f52d484244105fa3d558ce8217a5190cd3d40536076bef66d92d01566325/pandas-0.23.4-cp27-cp27mu-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/1e/a1/be8c8610f4620c56790965ba2b564dd76d13cbcd7c2ff8f6053ce63027fb/tenacity-5.1.1-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.8->apache-beam==2.16.0.dev0) (1.12.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from grpcio<2,>=1.8->apache-beam==2.16.0.dev0) (1.1.6)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/f9/d8/bd657bfa0e89eb71ad5e977ed99a9bb2b44e5db68d9190970637c26501bb/pbr-5.4.2-py2.py3-none-any.whl
Collecting funcsigs>=1; python_version < "3.3" (from mock<3.0.0,>=1.0.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6a/6e/209351ec34b7d7807342e2bb6ff8a96eef1fd5dcac13bdbadf065c2bb55c/pyasn1-0.4.6-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/be/70/e5ea8afd6d08a4b99ebfc77bd1845248d56cfcf43d11f9dc324b9580a35c/pyasn1_modules-0.2.6-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.16.0.dev0) (41.1.0)
Collecting pyparsing>=2.1.4 (from pydot<2,>=1.2.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/11/fa/0160cd525c62d7abd076a070ff02b2b94de589f1a9789774f17d7c54058e/pyparsing-2.4.2-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.29,>=0.5.28->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0 (from google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/e5/7059475b3013a3c75abe35015c5761735ab224eb1b129fee7c8e376e7805/google_api_core-1.14.2-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub<0.40.0,>=0.39.0->apache-beam==2.16.0.dev0)
Collecting google-resumable-media>=0.3.1 (from google-cloud-bigquery<1.18.0,>=1.6.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.16.0.dev0)
Collecting monotonic>=0.6; python_version == "2.7" (from tenacity<6.0,>=5.0.2->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.6.0->google-cloud-datastore<1.8.0,>=1.7.1->apache-beam==2.16.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, docopt, urllib3, certifi, chardet, idna, requests, hdfs, httplib2, pbr, funcsigs, mock, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, pyyaml, avro, pyvcf, typing, numpy, pyarrow, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, proto-google-cloud-datastore-v1, googledatastore, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-1.9.0 cachetools-3.1.1 certifi-2019.6.16 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 funcsigs-1.0.2 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.0 google-cloud-bigtable-0.32.2 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-0.39.1 google-resumable-media-0.3.2 googleapis-common-protos-1.6.0 googledatastore-7.0.2 grpc-google-iam-v1-0.11.4 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.16.4 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.4.2 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.14.1 pyasn1-0.4.6 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyvcf-0.6.8 pyyaml-3.13 requests-2.22.0 rsa-4.0 tenacity-5.1.1 typing-3.6.6 urllib3-1.25.3

> Task :sdks:python:apache_beam:testing:load_tests:run
<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/build/gradleenv/1329484227/local/lib/python2.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest) ... ERROR

======================================================================
ERROR: testGroupByKey (apache_beam.testing.load_tests.group_by_key_test.GroupByKeyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 70, in tearDown
    result = self.pipeline.run()
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 109, in run
    state = result.wait_until_finish()
  File "<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 455, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline load_tests_Python_Flink_Batch_GBK_4_0816100248_7323fc3d-4be0-4abb-ba12-6e42b905cd2a failed in state FAILED: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots within timeout of 300000 ms to run the job. Please make sure that the cluster has enough resources.
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: INFO: Metrics will not be collected
root: INFO: ==================== <function lift_combiners at 0x7f8e2ac6a320> ====================
root: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read/Impulse_3\n  Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Split_4\n  Read/Split:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/AddRandomKeys_6\n  Read/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n  Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/GroupByKey_9\n  Read/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n  Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/RemoveRandomKeys_14\n  Read/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/ReadSplits_15\n  Read/ReadSplits:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start_16\n  Measure time: Start:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 0_17\n  GroupByKey 0:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 0_21\n  Ungroup 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 0_22\n  Measure time: End 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 1_23\n  GroupByKey 1:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 1_27\n  Ungroup 1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 1_28\n  Measure time: End 1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 2_29\n  GroupByKey 2:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 2_33\n  Ungroup 2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 2_34\n  Measure time: End 2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 3_35\n  GroupByKey 3:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 3_39\n  Ungroup 3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 3_40\n  Measure time: End 3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: INFO: ==================== <function expand_sdf at 0x7f8e2ac6a398> ====================
root: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_Read/Impulse_3\n  Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Split_4\n  Read/Split:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/AddRandomKeys_6\n  Read/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_8\n  Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/GroupByKey_9\n  Read/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_13\n  Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/Reshuffle/RemoveRandomKeys_14\n  Read/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Read/ReadSplits_15\n  Read/ReadSplits:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: Start_16\n  Measure time: Start:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 0_17\n  GroupByKey 0:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 0_21\n  Ungroup 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 0_22\n  Measure time: End 0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 1_23\n  GroupByKey 1:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 1_27\n  Ungroup 1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 1_28\n  Measure time: End 1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 2_29\n  GroupByKey 2:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 2_33\n  Ungroup 2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 2_34\n  Measure time: End 2:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_GroupByKey 3_35\n  GroupByKey 3:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Ungroup 3_39\n  Ungroup 3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Measure time: End 3_40\n  Measure time: End 3:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
root: DEBUG: Runner option 'dataflow_kms_key' was already added
root: DEBUG: Runner option 'enable_streaming_engine' was already added
root: DEBUG: Runner option 'project' was already added
root: DEBUG: Runner option 'zone' was already added
root: DEBUG: Runner option 'job_name' was already added
root: DEBUG: Runner option 'runner' was already added
root: DEBUG: Runner option 'temp_location' was already added
root: DEBUG: Runner option 'experiments' was already added
root: DEBUG: Runner option 'streaming' was already added
root: DEBUG: Runner option 'environment_cache_millis' was already added
root: DEBUG: Runner option 'job_endpoint' was already added
root: DEBUG: Runner option 'sdk_worker_parallelism' was already added
root: DEBUG: Runner option 'files_to_stage' was already added
root: WARNING: Discarding unparseable args: ['--publish_to_big_query=false', '--metrics_dataset=load_test', '--metrics_table=python_flink_batch_GBK_4', '--input_options={"num_records": 5000000,"key_size": 10,"value_size":90}', '--iterations=1', '--fanout=4']
root: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 600af72fdd5f6e9bb5f5c567b0975b6b)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:268)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:487)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:475)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:450)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:210)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:187)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:200)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:92)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:68)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:78)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265)
	... 16 more
Caused by: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots within timeout of 300000 ms to run the job. Please make sure that the cluster has enough resources.
	at org.apache.flink.runtime.executiongraph.Execution.lambda$scheduleForExecution$1(Execution.java:435)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
	at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
	at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:772)
	at akka.dispatch.OnComplete.internal(Future.scala:258)
	at akka.dispatch.OnComplete.internal(Future.scala:256)
	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:186)
	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:183)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
	at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:83)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
	at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:603)
	at akka.actor.Scheduler$$anon$4.run(Scheduler.scala:126)
	at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
	at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:109)
	at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
	at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:329)
	at akka.actor.LightArrayRevolverScheduler$$anon$4.executeBucket$1(LightArrayRevolverScheduler.scala:280)
	at akka.actor.LightArrayRevolverScheduler$$anon$4.nextTick(LightArrayRevolverScheduler.scala:284)
	at akka.actor.LightArrayRevolverScheduler$$anon$4.run(LightArrayRevolverScheduler.scala:236)
	... 1 more

root: ERROR: org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate enough slots within timeout of 300000 ms to run the job. Please make sure that the cluster has enough resources.
root: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
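
Note: the stage dump in the captured logging above corresponds to a fanout-4 GBK pipeline, matching the --fanout=4 and --input_options arguments the test was launched with: a synthetic Read feeding a "Measure time: Start" step, then four parallel GroupByKey i / Ungroup i / "Measure time: End i" branches. A rough sketch of a pipeline with that shape; the source and the measurement steps below are simplified placeholders, not the load-test framework's actual transforms:

    import apache_beam as beam

    def apply_gbk_fanout(p, fanout=4):
        # Placeholder source; the real test synthesizes 5,000,000 records
        # with 10-byte keys and 90-byte values per --input_options.
        records = (p
                   | 'Read' >> beam.Create([(b'0123456789', b'x' * 90)])
                   | 'Measure time: Start' >> beam.Map(lambda kv: kv))
        # Each iteration adds an independent branch off the same input,
        # so the same data is grouped `fanout` times in parallel.
        for i in range(fanout):
            _ = (records
                 | 'GroupByKey %d' % i >> beam.GroupByKey()
                 | 'Ungroup %d' % i >> beam.FlatMap(
                       lambda kv: ((kv[0], v) for v in kv[1]))
                 | 'Measure time: End %d' % i >> beam.Map(lambda kv: kv))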

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 307.865s

FAILED (errors=1)
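
Note: the traceback at the top of the error report shows where load tests surface runner failures: the pipeline is submitted in tearDown via TestPipeline.run(), which blocks on wait_until_finish() and raises once the portable runner reports a terminal FAILED state. A minimal sketch of that submit-and-wait pattern; the job endpoint below is a placeholder, the real value comes from the Jenkins job's --job_endpoint flag:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder endpoint for a portable Flink job server.
    opts = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
    ])
    p = beam.Pipeline(options=opts)
    _ = p | beam.Create([('k', 1), ('k', 2)]) | beam.GroupByKey()
    result = p.run()                    # submit the job to the job server
    state = result.wait_until_finish()  # blocks; raises RuntimeError on FAILED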

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
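
Note: Gradle's "error occurred" is generic; the actionable failure is the NoResourceAvailableException above. The job requested more task slots than the Flink cluster could provide within the slot request timeout (the 300000 ms in the message, i.e. Flink's 5-minute default). Assuming a self-managed cluster, the usual remedies are adding TaskManagers, raising slots per TaskManager, or lowering the job's parallelism; illustrative flink-conf.yaml values, not taken from this cluster:

    # Slots offered by each TaskManager; total slots across the cluster
    # must cover the job's maximum requested parallelism.
    taskmanager.numberOfTaskSlots: 4
    # Default parallelism for jobs that do not set their own.
    parallelism.default: 4
    # How long the scheduler waits for slots before failing (milliseconds).
    slot.request.timeout: 300000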

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 38s
3 actionable tasks: 3 executed

Publishing build scan...
https://gradle.com/s/pc2xbam7cizz4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_GBK_Flink_Batch #66

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/66/display/redirect?page=changes>

