Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/07/28 03:23:41 UTC

Build failed in Jenkins: beam_python_mongoio_load_test #3771

See <https://ci-beam.apache.org/job/beam_python_mongoio_load_test/3771/display/redirect?page=changes>

Changes:

[samuelw] Fixes #22438. Ensure that WindmillStateReader completes all batched read

[Valentyn Tymofieiev] Restrict google-api-core

[Valentyn Tymofieiev] Regenerate the container dependencies.

[noreply] Replace distutils with supported modules. (#22456)

[noreply] [22369] Default Metrics for Executable Stages in Samza Runner (#22370)

[Kiley Sok] Moving to 2.42.0-SNAPSHOT on master branch.

[noreply] Remove stripping of step name. Replace removing only suffix step name


------------------------------------------
[...truncated 130.58 KB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:29.124Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-27_20_16_27-13381159890760975027.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:31.069Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:31.822Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:31.855Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:31.961Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:31.997Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.136Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.191Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/KeyWithVoid into Map
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.259Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/KeyWithVoid into Map
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.292Z: JOB_MESSAGE_DETAILED: Fusing consumer Map into ReadFromMongoDB/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into Count/CombineGlobally(CountCombineFn)/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.359Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.380Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.491Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/UnKey into Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.519Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/CombinePerKey/GroupByKey+Combine/CombinePerKey/Combine/Partial into Combine/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.553Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/CombinePerKey/GroupByKey/Reify into Combine/CombinePerKey/GroupByKey+Combine/CombinePerKey/Combine/Partial
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.580Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/CombinePerKey/GroupByKey/Write into Combine/CombinePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.607Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/CombinePerKey/Combine into Combine/CombinePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.633Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/CombinePerKey/Combine/Extract into Combine/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.654Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/UnKey into Combine/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.685Z: JOB_MESSAGE_DETAILED: Unzipping flatten s23 for input s21.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.722Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten, into producer assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.747Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/GroupByWindow into assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.784Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into assert_that/Group/CoGroupByKeyImpl/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.809Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/RestoreTags into assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.844Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/RestoreTags
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.880Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.906Z: JOB_MESSAGE_DETAILED: Unzipping flatten s23-u49 for input s24-reify-value27-c47
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.940Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten/Unzipped-1, into producer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:32.972Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify into assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.005Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.029Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s16.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.061Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/WindowInto(WindowIntoFn), through flatten Flatten, into producer Count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.087Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.130Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.159Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17-u56 for input s19.None-c54
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.224Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/ToVoidKey, through flatten Flatten/Unzipped-1, into producer assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.255Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Combine/InjectDefault/InjectDefault
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.278Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into Count/CombineGlobally(CountCombineFn)/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.343Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine/InjectDefault/InjectDefault into Combine/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.411Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.468Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.504Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.530Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.772Z: JOB_MESSAGE_DEBUG: Executing wait step start74
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.852Z: JOB_MESSAGE_BASIC: Executing operation Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.890Z: JOB_MESSAGE_BASIC: Executing operation Combine/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.909Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:33.938Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:34.471Z: JOB_MESSAGE_BASIC: Finished operation Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:34.474Z: JOB_MESSAGE_BASIC: Finished operation Combine/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:34.544Z: JOB_MESSAGE_DEBUG: Value "Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:34.594Z: JOB_MESSAGE_DEBUG: Value "Combine/CombinePerKey/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:16:34.674Z: JOB_MESSAGE_BASIC: Executing operation ReadFromMongoDB/Read+Map+Count/CombineGlobally(CountCombineFn)/KeyWithVoid+Combine/KeyWithVoid+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Combine/CombinePerKey/GroupByKey+Combine/CombinePerKey/Combine/Partial+Combine/CombinePerKey/GroupByKey/Reify+Combine/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-07-27_20_16_27-13381159890760975027 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:17:01.989Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
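
The JOB_MESSAGE_BASIC above reports that the project has hit Cloud Monitoring's limit of 100 custom metric descriptors; it is a warning, not part of the failure. As the message suggests, old or unused descriptors can be listed and deleted. A minimal sketch using the google-cloud-monitoring client (the project ID below is a placeholder):

    from google.cloud import monitoring_v3

    # Placeholder project; substitute the project running the Dataflow jobs.
    project_name = "projects/my-project-id"

    client = monitoring_v3.MetricServiceClient()
    for descriptor in client.list_metric_descriptors(name=project_name):
        # Dataflow-created user metrics live under custom.googleapis.com/*.
        if descriptor.type.startswith("custom.googleapis.com/"):
            print(descriptor.type)
            # Uncomment only after reviewing; deletion is permanent.
            # client.delete_metric_descriptor(name=descriptor.name)
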
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:17:13.166Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:17:39.028Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:20:05.428Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:20:05.512Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:20:10.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Resizing worker pool from 5 to 1.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:14.926Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 556, in _get_head_document_id
    return cursor[0]["_id"]
  File "/usr/local/lib/python3.7/site-packages/pymongo/cursor.py", line 694, in __getitem__
    raise IndexError("no such item for Cursor instance")
IndexError: no such item for Cursor instance

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 646, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 255, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 300, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 306, in split
    start_position, stop_position
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 564, in _replace_none_positions
    start_position = self._get_head_document_id(ASCENDING)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 559, in _get_head_document_id
    raise ValueError("Empty Mongodb collection")
ValueError: Empty Mongodb collection
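
The root cause is visible in the inner frames: _get_head_document_id asks pymongo for the first _id in the collection, an empty collection leaves the cursor with no items, cursor[0] raises IndexError, and the connector re-raises it as ValueError("Empty Mongodb collection"). A minimal sketch of that pattern (the URI and database/collection names are placeholders):

    from pymongo import MongoClient, ASCENDING

    # Placeholder connection details for illustration.
    coll = MongoClient("mongodb://localhost:27017")["test_db"]["test_coll"]

    def get_head_document_id(sort_order):
        # Same shape as mongodbio.py lines 556-559: first _id in sort order.
        cursor = coll.find().sort("_id", sort_order).limit(1)
        try:
            return cursor[0]["_id"]  # IndexError when the cursor is empty
        except IndexError:
            raise ValueError("Empty Mongodb collection")

    get_head_document_id(ASCENDING)  # raises ValueError on an empty collection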

[...identical IndexError / ValueError("Empty Mongodb collection") traceback repeated in JOB_MESSAGE_ERROR entries at 03:22:17.020Z, 03:22:19.106Z, and 03:22:21.190Z...]

INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:21.222Z: JOB_MESSAGE_BASIC: Finished operation ReadFromMongoDB/Read+Map+Count/CombineGlobally(CountCombineFn)/KeyWithVoid+Combine/KeyWithVoid+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Combine/CombinePerKey/GroupByKey+Combine/CombinePerKey/Combine/Partial+Combine/CombinePerKey/GroupByKey/Reify+Combine/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:21.313Z: JOB_MESSAGE_DEBUG: Executing failure step failure73
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:21.363Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S03:ReadFromMongoDB/Read+Map+Count/CombineGlobally(CountCombineFn)/KeyWithVoid+Combine/KeyWithVoid+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Combine/CombinePerKey/GroupByKey+Combine/CombinePerKey/Combine/Partial+Combine/CombinePerKey/GroupByKey/Reify+Combine/CombinePerKey/GroupByKey/Write failed., Internal Issue (112d683cf2159e30): 63963027:24514
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:21.513Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:21.598Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:21.632Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:57.469Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:57.533Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-28T03:22:57.561Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2022-07-27_20_16_27-13381159890760975027 is in state JOB_STATE_FAILED
ERROR:apache_beam.runners.dataflow.dataflow_runner:Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2022-07-27_20_16_27-13381159890760975027?project=<ProjectId>
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apache_beam/io/mongodbio_it_test.py",> line 170, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apache_beam/io/mongodbio_it_test.py",> line 157, in run
    assert_that(r, equal_to([expected['number_sum'], expected['docs_count']]))
  File "<https://ci-beam.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apache_beam/pipeline.py",> line 597, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apache_beam/testing/test_pipeline.py",> line 116, in run
    state = result.wait_until_finish()
  File "<https://ci-beam.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apache_beam/runners/dataflow/dataflow_runner.py",> line 1676, in wait_until_finish
    self)
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 556, in _get_head_document_id
    return cursor[0]["_id"]
  File "/usr/local/lib/python3.7/site-packages/pymongo/cursor.py", line 694, in __getitem__
    raise IndexError("no such item for Cursor instance")
IndexError: no such item for Cursor instance

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 646, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 255, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 263, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 300, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 306, in split
    start_position, stop_position
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 564, in _replace_none_positions
    start_position = self._get_head_document_id(ASCENDING)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/mongodbio.py", line 559, in _get_head_document_id
    raise ValueError("Empty Mongodb collection")
ValueError: Empty Mongodb collection
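
Every source-split attempt failed the same way, so the collection was already empty when ReadFromMongoDB ran, suggesting the documents the test expected to read were never written or were cleaned up early. A cheap precondition check in the test setup would fail fast with a clearer message; a sketch (URI and names are placeholders, count_documents is standard pymongo):

    from pymongo import MongoClient

    # Placeholder URI/names; match whatever the load test's write phase used.
    coll = MongoClient("mongodb://localhost:27017")["test_db"]["test_coll"]
    if coll.count_documents({}) == 0:
        raise RuntimeError(
            "Load test precondition failed: MongoDB collection is empty")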


> Task :sdks:python:test-suites:dataflow:py37:mongodbioIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 316

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:mongodbioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 39m 39s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/k2gxewdxegjug

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_python_mongoio_load_test #3772

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_python_mongoio_load_test/3772/display/redirect>

