Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/06/24 20:00:50 UTC

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1035

See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1035/display/redirect>

Changes:


------------------------------------------
[...truncated 325.15 KB...]
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/****/sdk_****.py", line 370, in <lambda>
    lambda: self.create_****().do_instruction(request), request)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/****/sdk_****.py", line 629, in do_instruction
    return getattr(self, request_type)(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/****/sdk_****.py", line 660, in process_bundle
    bundle_processor = self.bundle_processor_cache.get(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/****/sdk_****.py", line 491, in get
    processor = bundle_processor.BundleProcessor(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/****/bundle_processor.py", line 877, in __init__
    _verify_descriptor_created_in_a_compatible_env(process_bundle_descriptor)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/****/bundle_processor.py", line 840, in _verify_descriptor_created_in_a_compatible_env
    raise RuntimeError(
RuntimeError: Pipeline construction environment and pipeline runtime environment are not compatible. If you use a custom container image, check that the Python interpreter minor version and the Apache Beam version in your image match the versions used at pipeline construction time. Submission environment: beam:version:sdk_base:apache/beam_python3.8_sdk:2.49.0.dev. Runtime environment: beam:version:sdk_base:apache/beam_python3.8_sdk:2.48.0.dev.

INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-24T18:28:51.598Z: JOB_MESSAGE_ERROR: generic::unknown: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 295, in _execute
    response = task()
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 370, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 629, in do_instruction
    return getattr(self, request_type)(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 660, in process_bundle
    bundle_processor = self.bundle_processor_cache.get(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py", line 491, in get
    processor = bundle_processor.BundleProcessor(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 877, in __init__
    _verify_descriptor_created_in_a_compatible_env(process_bundle_descriptor)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/bundle_processor.py", line 840, in _verify_descriptor_created_in_a_compatible_env
    raise RuntimeError(
RuntimeError: Pipeline construction environment and pipeline runtime environment are not compatible. If you use a custom container image, check that the Python interpreter minor version and the Apache Beam version in your image match the versions used at pipeline construction time. Submission environment: beam:version:sdk_base:apache/beam_python3.8_sdk:2.49.0.dev. Runtime environment: beam:version:sdk_base:apache/beam_python3.8_sdk:2.48.0.dev.
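The RuntimeError above is Beam's environment-compatibility check failing: the pipeline was constructed against the 2.49.0.dev SDK but ran on workers with a 2.48.0.dev container. As the message says, the remedy is to make the worker image's Beam version and Python minor version match the submission environment. A minimal sketch of pinning the image at submission time (the project, bucket, and image tag below are illustrative placeholders, not this job's configuration):

    # Hedged sketch: pin the worker container so that the construction and
    # runtime environments match. --sdk_container_image is a standard Beam
    # pipeline option; every concrete value here is a placeholder.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--streaming',
        # Must carry the same Beam version and Python minor version as the
        # environment that constructs the pipeline:
        '--sdk_container_image=apache/beam_python3.8_sdk:2.48.0',
    ])

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3])  # stand-in for the real load-test pipeline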

[the JOB_MESSAGE_ERROR traceback above repeated at 18:29:06, 18:29:22, 18:29:37, 18:30:03, 18:30:33, 18:31:04, 18:31:35, 18:32:06, 18:32:36, and 18:33:07 UTC]

INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-24T18:49:02.781Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-24T19:22:04.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-24T19:46:06.152Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-24_08_07_39-16753807616813747312 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1556, in wait_until_finish
    assert duration or terminated, (
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-24_08_07_39-16753807616813747312?project=<ProjectId>
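For reference, the assertion in wait_until_finish fires only when no (or a falsy) duration was given and the job never reached a terminal state, so the timeout_ms above was evidently unset. A driver can bound the wait and cancel the job itself instead of waiting indefinitely; a hedged sketch (the helper name and default timeout are illustrative, while wait_until_finish(duration=...), PipelineState.is_terminal() and cancel() are part of Beam's public runner API):

    # Hedged sketch: bound how long we wait on a streaming job and cancel
    # it on timeout rather than asserting after an indefinite wait.
    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms=4 * 60 * 60 * 1000):
        """Wait up to timeout_ms for `result` (a PipelineResult) to reach a
        terminal state; cancel the job if it has not."""
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()                       # stop the streaming job
            state = result.wait_until_finish()    # wait for CANCELLED
        return state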

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 24s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/ltjpohuajnqra

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1037

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1037/display/redirect>

Changes:


------------------------------------------
[...truncated 26.73 KB...]
  Using cached google_cloud_spanner-3.36.0-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.3-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting google-cloud-aiplatform<2.0,>=1.26.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_aiplatform-1.26.1-py2.py3-none-any.whl (2.6 MB)
Collecting freezegun>=0.3.12 (from apache-beam==2.49.0.dev0)
  Using cached freezegun-1.2.2-py3-none-any.whl (17 kB)
Collecting joblib>=1.0.1 (from apache-beam==2.49.0.dev0)
  Using cached joblib-1.2.0-py3-none-any.whl (297 kB)
Collecting mock<6.0.0,>=1.0.1 (from apache-beam==2.49.0.dev0)
  Using cached mock-5.0.2-py3-none-any.whl (30 kB)
Collecting parameterized<0.10.0,>=0.7.1 (from apache-beam==2.49.0.dev0)
  Using cached parameterized-0.9.0-py2.py3-none-any.whl (20 kB)
Collecting pyhamcrest!=1.10.0,<3.0.0,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached pyhamcrest-2.0.4-py3-none-any.whl (52 kB)
Collecting pyyaml<7.0.0,>=3.12 (from apache-beam==2.49.0.dev0)
  Using cached PyYAML-6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (701 kB)
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.4.0-py3-none-any.whl (323 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.3.0rc1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.1 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.79.3-py3-none-any.whl (417 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.160 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.160-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.12.0.dev0-py3-none-any.whl (128 kB)
Requirement already satisfied: packaging>=14.3 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.8/site-packages (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0) (23.1)
Collecting google-cloud-storage<3.0.0dev,>=1.32.0 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached google_cloud_storage-2.9.0-py2.py3-none-any.whl (113 kB)
Collecting google-cloud-resource-manager<3.0.0dev,>=1.3.3 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached google_cloud_resource_manager-1.10.1-py2.py3-none-any.whl (321 kB)
Collecting shapely<2.0.0 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached Shapely-1.8.5.post1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.1 MB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.19.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.56.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.1.0-py3-none-any.whl (102 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.8/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.2.0)
Requirement already satisfied: tomli>=1.0.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.8/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (195 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.5.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.10.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.5 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-3.0.0a1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (613 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (81 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.1.0-py3-none-any.whl (44 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.6.1-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0.dev0,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3224269 sha256=6519589f4160243f55a73f66f65f0fc7345514906454ffd103bb97f6ed651978
  Stored in directory: /home/jenkins/.cache/pip/wheels/67/34/b8/5adce605a0a3f10491be234729fa4854c6a127bc01194f13fc
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, typing-extensions, threadpoolctl, tenacity, sqlparse, six, shapely, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, attrs, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pytest, pymongo, pydot, pyasn1-modules, isodate, hypothesis, httplib2, grpcio-status, google-resumable-media, cffi, requests_mock, pytest-xdist, pytest-timeout, pandas, oauth2client, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-storage, google-cloud-spanner, google-cloud-resource-manager, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, google-cloud-aiplatform, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.1 azure-identity-1.14.0b1 azure-storage-blob-12.17.0b1 boto3-1.26.160 botocore-1.29.160 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.12.0.dev0 google-apitools-0.5.31 google-auth-2.20.0 google-auth-httplib2-0.1.0 google-cloud-aiplatform-1.26.1 google-cloud-bigquery-3.11.2 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.19.0 google-cloud-core-2.3.2 google-cloud-datastore-2.16.0 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.8.2 google-cloud-recommendations-ai-0.10.3 google-cloud-resource-manager-1.10.1 google-cloud-spanner-3.36.0 google-cloud-storage-2.9.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.3 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-3.0.0a1 grpc-google-iam-v1-0.12.6 grpcio-status-1.56.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.79.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.5.3 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.3 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.4.0 pymysql-1.1.0 pyparsing-3.1.0 pytest-7.4.0 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.3.0rc1 scipy-1.10.1 shapely-1.8.5.post1 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 typing-extensions-4.7.0rc1 urllib3-1.26.16 websocket-client-1.6.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422" for Docker environment
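Given that build #1035 above failed because the worker image carried a different dev SDK than the submission environment, a pre-flight check on the image can catch such a mismatch before the job is launched. A hedged sketch, assuming Docker is available on the submitting host (the image tag is the one logged above; this check is not part of the actual test harness):

    # Hedged sketch: compare the Beam version baked into the worker image
    # with the submission environment's version before launching a job.
    import subprocess
    import apache_beam

    IMAGE = 'gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422'

    # The SDK image's default entrypoint is the worker boot binary, so
    # override it to run a one-line Python probe instead.
    runtime_version = subprocess.check_output(
        ['docker', 'run', '--rm', '--entrypoint', 'python', IMAGE, '-c',
         'import apache_beam; print(apache_beam.__version__)'],
        text=True).strip()

    if runtime_version != apache_beam.__version__:
        raise SystemExit(f'Worker image has Beam {runtime_version}, but the '
                         f'submission environment has {apache_beam.__version__}')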
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0626125343.1687795257.612959/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0626125343.1687795257.612959/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0626125343.1687795257.612959/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0626125343.1687795257.612959/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230626160057614276-6487'
 createTime: '2023-06-26T16:00:59.442110Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-26_09_00_58-1706209663654876864'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0626125343'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-26T16:00:59.442110Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-26_09_00_58-1706209663654876864]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-26_09_00_58-1706209663654876864
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-26_09_00_58-1706209663654876864?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-26_09_00_58-1706209663654876864 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:04.361Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:10.531Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:10.753Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:10.817Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:10.865Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:10.903Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:10.958Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.021Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.061Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.118Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.149Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.182Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.204Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.235Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.256Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.278Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.307Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.339Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.362Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.522Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.540Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.564Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.585Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.605Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.766Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.794Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:11.815Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-26_09_00_58-1706209663654876864 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:31.398Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
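The metric-descriptor warning above is benign for this run, but the cleanup it suggests can be scripted. A hedged sketch using the google-cloud-monitoring client (the project id is a placeholder, and the delete call is commented out because it is destructive):

    # Hedged sketch: list Dataflow-created custom metric descriptors and
    # optionally delete unused ones to free quota, as the message suggests.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = 'projects/my-project'  # placeholder

    request = {
        'name': project,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Destructive; only for descriptors that are truly unused:
        # client.delete_metric_descriptor(
        #     name=f'{project}/metricDescriptors/{descriptor.type}')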
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:01:57.544Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:02:33.489Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:02:44.290Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:25:23.993Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T16:58:21.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T17:01:36.780Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T17:38:23.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-26T17:40:29.058Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-6' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy146.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:321)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-6 is offline; cannot locate jdk_1.8_latest



Jenkins build is back to normal : beam_LoadTests_Python_Combine_Dataflow_Streaming #1040

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1040/display/redirect>




Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1039

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1039/display/redirect>

Changes:


------------------------------------------
[...truncated 29.25 KB...]
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.3-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting google-cloud-aiplatform<2.0,>=1.26.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_aiplatform-1.26.1-py2.py3-none-any.whl (2.6 MB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.162 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.162-py3-none-any.whl (11.0 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.12.0.dev0-py3-none-any.whl (128 kB)
Requirement already satisfied: packaging>=14.3 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages> (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0) (23.1)
Collecting google-cloud-storage<3.0.0dev,>=1.32.0 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached google_cloud_storage-2.10.0-py2.py3-none-any.whl (114 kB)
Collecting google-cloud-resource-manager<3.0.0dev,>=1.3.3 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached google_cloud_resource_manager-1.10.1-py2.py3-none-any.whl (321 kB)
Collecting shapely<2.0.0 (from google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached Shapely-1.8.5.post1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.1 MB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.19.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.56.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.1.0-py3-none-any.whl (102 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.2.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (195 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.5.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.10.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.5 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-3.0.0a1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (613 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (81 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.1.0-py3-none-any.whl (44 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.6.1-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0.dev0,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform<2.0,>=1.26.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3228368 sha256=3361669bbf6be211b597dc241f16e64676c397c66757e259b446e6b7f1e2e22b
  Stored in directory: /home/jenkins/.cache/pip/wheels/67/34/b8/5adce605a0a3f10491be234729fa4854c6a127bc01194f13fc
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, typing-extensions, threadpoolctl, tenacity, sqlparse, six, shapely, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, attrs, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pytest, pymongo, pydot, pyasn1-modules, isodate, hypothesis, httplib2, grpcio-status, google-resumable-media, cffi, requests_mock, pytest-xdist, pytest-timeout, pandas, oauth2client, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-storage, google-cloud-spanner, google-cloud-resource-manager, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, google-cloud-aiplatform, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.1 azure-identity-1.14.0b1 azure-storage-blob-12.17.0b1 boto3-1.26.162 botocore-1.29.162 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.12.0.dev0 google-apitools-0.5.31 google-auth-2.22.0rc1 google-auth-httplib2-0.1.0 google-cloud-aiplatform-1.26.1 google-cloud-bigquery-3.11.3 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.19.0 google-cloud-core-2.3.2 google-cloud-datastore-2.16.0 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.8.2 google-cloud-recommendations-ai-0.10.3 google-cloud-resource-manager-1.10.1 google-cloud-spanner-3.36.0 google-cloud-storage-2.10.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.3 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-3.0.0a1 grpc-google-iam-v1-0.12.6 grpcio-status-1.56.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.80.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.3.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.5.3 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.3 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.4.0 pymysql-1.1.0 pyparsing-3.1.0 pytest-7.4.0 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.3.0rc1 scipy-1.10.1 shapely-1.8.5.post1 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 typing-extensions-4.7.0rc1 urllib3-1.26.16 websocket-client-1.6.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
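
The pre-building workflow suggested above is driven by Beam pipeline options. A hypothetical invocation of this load test with pre-building enabled might look like the following; the flags exist in Beam, but the registry URL and the exact option set are illustrative, not this job's actual configuration:

    python -m apache_beam.testing.load_tests.combine_test \
        --runner=DataflowRunner \
        --project=apache-beam-testing \
        --region=us-central1 \
        --temp_location=gs://temp-storage-for-perf-tests/smoketests \
        --streaming \
        --prebuild_sdk_container_engine=cloud_build \
        --docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt-sdk

With pre-building, the extra dependencies are baked into a container image once and pushed to the given registry, instead of being reinstalled on every worker at startup.
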
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0628125350.1687964871.676535/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0628125350.1687964871.676535/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0628125350.1687964871.676535/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0628125350.1687964871.676535/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230628150751677478-1848'
 createTime: '2023-06-28T15:07:52.820188Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-28_08_07_52-335424733038044502'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0628125350'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-28T15:07:52.820188Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-28_08_07_52-335424733038044502]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-28_08_07_52-335424733038044502
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-28_08_07_52-335424733038044502?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-28_08_07_52-335424733038044502 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:57.852Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.255Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.286Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.357Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.427Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.456Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.513Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.582Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.618Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.647Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.675Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.698Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.729Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.754Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.786Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.807Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.897Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:07:59.963Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
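
The fusing messages above trace the optimized graph of this Combine load test: a synthetic source feeding a "Measure time: Start" step, a global Top combine (expanded by the runner into KeyWithVoid / CombinePerKey / UnKey), then "Consume" and "Measure time: End". A minimal sketch of that shape, assuming beam.transforms.combiners.TopCombineFn and using beam.Create as a stand-in for the synthetic source (this is not the actual combine_test.py):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create([(i % 10, i) for i in range(1000)])
            | 'Measure time: Start' >> beam.Map(lambda kv: kv)  # the real test records a runtime metric here
            | 'Combine with Top 0' >> beam.CombineGlobally(
                combiners.TopCombineFn(16)).without_defaults()  # expands to KeyWithVoid/CombinePerKey/UnKey
            | 'Consume 0' >> beam.FlatMap(lambda top: top)
            | 'Measure time: End 0' >> beam.Map(lambda x: x))   # the real test records a second metric here
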
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.069Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.094Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.129Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.170Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.201Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.390Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.413Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:00.482Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-28_08_07_52-335424733038044502 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:34.434Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
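
When the 100-descriptor quota is hit, old custom.googleapis.com/* descriptors can be deleted as the message suggests. A hedged sketch with the google-cloud-monitoring client; the project ID comes from the job above, but verify the filter before running, since descriptor deletion is irreversible:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Deletes every matching custom metric descriptor; narrow the filter first if needed.
        client.delete_metric_descriptor(name=descriptor.name)
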
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:08:44.193Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:09:14.358Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:09:21.508Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:21:46.561Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:47:57.351Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T15:49:02.235Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T16:19:59.256Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T16:21:00.036Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T16:53:02.571Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T16:54:07.886Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T17:17:04.812Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T17:18:05.859Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T17:42:07.595Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T17:43:12.358Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T18:15:13.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T18:17:15.599Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T18:39:13.657Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T18:41:14.645Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T19:03:16.759Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T19:18:17.709Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T19:26:19.060Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T19:41:21.516Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T19:57:22.393Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-28_08_07_52-335424733038044502 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T20:02:51.481Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-28_08_07_52-335424733038044502.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T20:02:51.509Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T20:02:51.569Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T20:02:51.585Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T20:02:51.606Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-28T20:02:51.630Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1556, in wait_until_finish
    assert duration or terminated, (
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-28_08_07_52-335424733038044502?project=<ProjectId>
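
The assertion comes from load_test.py's run(): wait_until_finish was called with duration=self.timeout_ms, the timeout was evidently unset, and the wait ended (after the cancel request above) without the job reporting a terminal state, so the guard "assert duration or terminated" fired. A hedged sketch of the bounded-wait pattern involved (the names mirror the traceback; the explicit cancel step is illustrative, not the file's exact code):

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline()  # pipeline options elided for brevity
    result = pipeline.run()
    # Bound the wait: for a streaming job this returns after roughly the given
    # number of milliseconds even if the job is still running.
    state = result.wait_until_finish(duration=60 * 60 * 1000)
    if state not in (PipelineState.DONE, PipelineState.FAILED, PipelineState.CANCELLED):
        result.cancel()                     # request cancellation of the streaming job
        state = result.wait_until_finish()  # then block until a terminal state is reported
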

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 56m 59s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/kzgasvtcdfbpw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1038

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1038/display/redirect>

Changes:


------------------------------------------
[...truncated 47.27 KB...]
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 65.9 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.17-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 68.7 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.16-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 69.2 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.15-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 70.8 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.14-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 73.6 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.13-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 68.8 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.12-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 70.5 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.11-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 70.4 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.10-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 74.1 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.9-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 69.6 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.8-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 69.2 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.7-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 70.5 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.6-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 67.8 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.5-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 65.5 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.4-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 40.0 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.3-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 66.5 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.2-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 66.5 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.1-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 66.3 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.0-cp37-cp37m-manylinux2014_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 46.9 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.0b3-cp37-cp37m-manylinux2010_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 48.3 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.0b2-cp37-cp37m-manylinux2010_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 69.4 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.4.0b1-cp37-cp37m-manylinux2010_x86_64.whl (1.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 58.7 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.24-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 62.8 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.23-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 64.4 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.22-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 66.7 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.21-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 64.6 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.20-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 65.3 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.19-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 65.5 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.18-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 65.4 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.17-cp37-cp37m-manylinux2010_x86_64.whl (1.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 65.1 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.16-cp37-cp37m-manylinux2010_x86_64.whl (1.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 68.6 MB/s eta 0:00:00
  Downloading SQLAlchemy-1.3.15.tar.gz (6.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.1/6.1 MB 92.8 MB/s eta 0:00:00
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
  Downloading SQLAlchemy-1.3.14.tar.gz (6.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.1/6.1 MB 33.6 MB/s eta 0:00:00
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
  Downloading SQLAlchemy-1.3.13.tar.gz (6.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 83.2 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.12.tar.gz (6.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 87.1 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.11.tar.gz (6.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 70.3 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.10.tar.gz (6.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 84.9 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.9.tar.gz (6.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 65.2 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.8.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 58.2 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.7.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 79.0 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.6.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 79.8 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.5.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 78.7 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.4.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 82.3 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.3.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 84.5 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.2.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 54.3 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.1.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 84.3 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
  Downloading SQLAlchemy-1.3.0.tar.gz (5.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 77.3 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Downloading scikit_learn-1.0.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (23.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 23.2/23.2 MB 57.5 MB/s eta 0:00:00
  Downloading scikit_learn-1.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (23.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 23.1/23.1 MB 53.4 MB/s eta 0:00:00
  Downloading scikit_learn-0.24.2-cp37-cp37m-manylinux2010_x86_64.whl (22.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 22.3/22.3 MB 56.0 MB/s eta 0:00:00
  Downloading scikit_learn-0.24.1-cp37-cp37m-manylinux2010_x86_64.whl (22.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 22.3/22.3 MB 50.5 MB/s eta 0:00:00
  Downloading scikit_learn-0.24.0-cp37-cp37m-manylinux2010_x86_64.whl (22.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 22.3/22.3 MB 56.5 MB/s eta 0:00:00
  Downloading scikit_learn-0.23.2-cp37-cp37m-manylinux1_x86_64.whl (6.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.8/6.8 MB 92.1 MB/s eta 0:00:00
  Downloading scikit_learn-0.23.1-cp37-cp37m-manylinux1_x86_64.whl (6.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.8/6.8 MB 88.9 MB/s eta 0:00:00
  Downloading scikit_learn-0.23.0-cp37-cp37m-manylinux1_x86_64.whl (7.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.3/7.3 MB 80.4 MB/s eta 0:00:00
  Downloading scikit_learn-0.22.2.post1-cp37-cp37m-manylinux1_x86_64.whl (7.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.1/7.1 MB 94.8 MB/s eta 0:00:00
  Downloading scikit_learn-0.22.2-cp37-cp37m-manylinux1_x86_64.whl (7.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.1/7.1 MB 92.4 MB/s eta 0:00:00
  Downloading scikit_learn-0.22.1-cp37-cp37m-manylinux1_x86_64.whl (7.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.0/7.0 MB 87.9 MB/s eta 0:00:00
  Downloading scikit_learn-0.22-cp37-cp37m-manylinux1_x86_64.whl (7.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.0/7.0 MB 85.9 MB/s eta 0:00:00
  Downloading scikit_learn-0.21.3-cp37-cp37m-manylinux1_x86_64.whl (6.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.7/6.7 MB 83.0 MB/s eta 0:00:00
  Downloading scikit_learn-0.21.2-cp37-cp37m-manylinux1_x86_64.whl (6.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.7/6.7 MB 85.8 MB/s eta 0:00:00
  Downloading scikit_learn-0.21.1-cp37-cp37m-manylinux1_x86_64.whl (6.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.7/6.7 MB 32.1 MB/s eta 0:00:00
  Downloading scikit_learn-0.21.0-cp37-cp37m-manylinux1_x86_64.whl (6.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.7/6.7 MB 38.5 MB/s eta 0:00:00
  Downloading scikit_learn-0.20.4-cp37-cp37m-manylinux1_x86_64.whl (5.4 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.4/5.4 MB 33.8 MB/s eta 0:00:00
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting requests<3.0.0,>=2.24.0 (from apache-beam==2.49.0.dev0)
  Using cached requests-2.30.0-py3-none-any.whl (62 kB)
  Using cached requests-2.29.0-py3-none-any.whl (62 kB)
  Using cached requests-2.28.2-py3-none-any.whl (62 kB)
  Downloading requests-2.28.1-py3-none-any.whl (62 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.8/62.8 kB 3.1 MB/s eta 0:00:00
Collecting charset-normalizer<3,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Downloading charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
Collecting requests<3.0.0,>=2.24.0 (from apache-beam==2.49.0.dev0)
  Downloading requests-2.28.0-py3-none-any.whl (62 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.8/62.8 kB 8.2 MB/s eta 0:00:00
Collecting charset-normalizer~=2.0.0 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting requests<3.0.0,>=2.24.0 (from apache-beam==2.49.0.dev0)
  Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
  Downloading requests-2.27.0-py2.py3-none-any.whl (63 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 63.1/63.1 kB 9.1 MB/s eta 0:00:00
  Downloading requests-2.26.0-py2.py3-none-any.whl (62 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.3/62.3 kB 9.9 MB/s eta 0:00:00
  Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.2/61.2 kB 9.6 MB/s eta 0:00:00
Collecting chardet<5,>=3.0.2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 178.7/178.7 kB 10.9 MB/s eta 0:00:00
Collecting idna<3,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 58.8/58.8 kB 9.8 MB/s eta 0:00:00
Collecting requests<3.0.0,>=2.24.0 (from apache-beam==2.49.0.dev0)
  Downloading requests-2.25.0-py2.py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.1/61.1 kB 10.7 MB/s eta 0:00:00
Collecting chardet<4,>=3.0.2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting requests<3.0.0,>=2.24.0 (from apache-beam==2.49.0.dev0)
  Downloading requests-2.24.0-py2.py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.8/61.8 kB 9.2 MB/s eta 0:00:00
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Downloading urllib3-1.25.11-py2.py3-none-any.whl (127 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 128.0/128.0 kB 19.4 MB/s eta 0:00:00
Collecting regex>=2020.6.8 (from apache-beam==2.49.0.dev0)
  Using cached regex-2023.5.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (756 kB)
  Using cached regex-2023.5.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (756 kB)
ERROR: Exception:
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/pip/_internal/cli/base_command.py",> line 169, in exc_logging_wrapper
    status = run_func(*args)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/pip/_internal/cli/req_command.py",> line 248, in wrapper
    return func(self, options, args)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/pip/_internal/commands/install.py",> line 378, in run
    reqs, check_supported_wheels=not options.target_dir
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/pip/_internal/resolution/resolvelib/resolver.py",> line 93, in resolve
    collected.requirements, max_rounds=limit_how_complex_resolution_can_be
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py",> line 546, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/pip/_vendor/resolvelib/resolvers.py",> line 457, in resolve
    raise ResolutionTooDeep(max_rounds)
pip._vendor.resolvelib.resolvers.ResolutionTooDeep: 200000
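
ResolutionTooDeep means pip's backtracking resolver hit its round limit (200000) while trying ever-older releases, visible above as the long ladders of SQLAlchemy, scikit-learn and requests downloads. One common mitigation is to pre-pin the packages being backtracked through in a constraints file; the pins below are illustrative (they mirror versions a later successful run of this job resolved to), not the project's official pins:

    # constraints.txt -- hypothetical pins for the packages pip cycled through above
    sqlalchemy==1.4.48
    scikit-learn==1.3.0rc1
    requests==2.31.0

    # then, from a Beam source checkout:
    pip install -c constraints.txt -e './sdks/python[gcp,test]'
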

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 34m 28s
14 actionable tasks: 8 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/kruyswoiiwfww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure




Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1036

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1036/display/redirect>

Changes:


------------------------------------------
[...truncated 28.79 KB...]
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.3.0rc1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.1 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.79.2-py3-none-any.whl (417 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.160 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.160-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.12.0.dev0-py3-none-any.whl (128 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.19.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.56.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.1.0-py3-none-any.whl (102 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.2.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.8/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (195 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.5.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.10.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.5 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-3.0.0a1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (613 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (81 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.1.0rc2-py3-none-any.whl (44 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.6.1-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0.dev0,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3215078 sha256=9cd86102158bd1b3d6efc0444b32d7d5aaa60dcfd15efff6c6a8e9ee8a41b8f8
  Stored in directory: /home/jenkins/.cache/pip/wheels/67/34/b8/5adce605a0a3f10491be234729fa4854c6a127bc01194f13fc
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, typing-extensions, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, attrs, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pytest, pymongo, pydot, pyasn1-modules, isodate, hypothesis, httplib2, grpcio-status, google-resumable-media, cffi, requests_mock, pytest-xdist, pytest-timeout, pandas, oauth2client, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.1 azure-identity-1.14.0b1 azure-storage-blob-12.17.0b1 boto3-1.26.160 botocore-1.29.160 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.12.0.dev0 google-apitools-0.5.31 google-auth-2.20.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.2 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.19.0 google-cloud-core-2.3.2 google-cloud-datastore-2.16.0 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.8.2 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.3 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-3.0.0a1 grpc-google-iam-v1-0.12.6 grpcio-status-1.56.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.79.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.5.3 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.3 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.4.0 pymysql-1.1.0rc2 pyparsing-3.1.0 pytest-7.4.0 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.3.0rc1 scipy-1.10.1 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 typing-extensions-4.7.0rc1 urllib3-1.26.16 websocket-client-1.6.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422" for Docker environment
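Note: the pre-building workflow mentioned above is enabled through pipeline options. A minimal sketch, assuming the Cloud Build engine; the registry push URL and project id below are placeholders, not values from this job:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch: ask Dataflow to bake the pipeline's extra dependencies into
    # the SDK container once via Cloud Build, instead of reinstalling them
    # on every worker at startup. The push URL is a placeholder.
    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--prebuild_sdk_container_engine=cloud_build",
        "--docker_registry_push_url=us-central1-docker.pkg.dev/<ProjectId>/<repo>",
    ])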
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0625125342.1687705659.081809/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0625125342.1687705659.081809/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0625125342.1687705659.081809/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0625125342.1687705659.081809/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230625150739082758-9048'
 createTime: '2023-06-25T15:07:42.576657Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-25_08_07_39-2562734777383858037'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0625125342'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-25T15:07:42.576657Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-25_08_07_39-2562734777383858037]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-25_08_07_39-2562734777383858037
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-25_08_07_39-2562734777383858037?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-25_08_07_39-2562734777383858037 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:07:52.438Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:07:58.756Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:03.771Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.617Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.686Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.707Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.751Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.806Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.845Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.879Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.903Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.925Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.946Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.968Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:04.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.014Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.036Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.061Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.086Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.246Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.274Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.298Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.323Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.354Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.519Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.546Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:05.588Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-25_08_07_39-2562734777383858037 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:08:09.821Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
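Note: the descriptor cleanup suggested in this message can be scripted. A minimal sketch, assuming the google-cloud-monitoring client library; the project id is a placeholder, and deleting a descriptor is irreversible, so review what the filter matches before running:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # Placeholder project; list only the Dataflow-created
    # custom.googleapis.com/* descriptors named in the log message above,
    # then delete the ones that are no longer needed.
    request = {
        "name": "projects/<ProjectId>",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(name=descriptor.name)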
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:10:58.610Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:10:58.644Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:11:08.449Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:11:08.471Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:11:18.535Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:11:37.864Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:11:48.434Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:15:07.566Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:47:12.931Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T15:48:14.216Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T16:12:15.331Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T16:13:17.007Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T16:45:18.621Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T16:46:29.839Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T17:17:24.549Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T17:18:22.715Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T17:49:23.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T17:52:35.314Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T18:21:36.619Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T18:23:38.028Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T18:48:39.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T18:54:31.041Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T19:21:33.054Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T19:27:34.612Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T19:54:36.974Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T19:59:37.902Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-25_08_07_39-2562734777383858037 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T20:00:43.406Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-25_08_07_39-2562734777383858037.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T20:00:43.465Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T20:00:43.505Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T20:00:43.530Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T20:00:43.560Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-25T20:00:43.585Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1556, in wait_until_finish
    assert duration or terminated, (
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-25_08_07_39-2562734777383858037?project=<ProjectId>
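Note: the assertion fires when wait_until_finish is given no duration and the streaming job never reaches a terminal state. A minimal sketch of bounding the wait and cancelling a still-running job; the one-hour timeout is illustrative, and the pipeline's transforms are elided:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline(options=PipelineOptions())
    # ... transforms elided ...
    result = pipeline.run()
    # duration is in milliseconds; wait_until_finish returns once the job
    # is terminal or the duration has elapsed.
    result.wait_until_finish(duration=60 * 60 * 1000)
    if not PipelineState.is_terminal(result.state):
        result.cancel()  # cancel the job instead of waiting indefinitely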

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/dmd2vyb2urfn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org