Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/06/08 22:10:01 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #3532

See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/3532/display/redirect?page=changes>

Changes:

[dannymccormick] Gather metrics on GH Issues

[dannymccormick] Fixes

[dannymccormick] Fixes

[dannymccormick] Comment + naming fix

[dannymccormick] Conflicts fix

[dannymccormick] Ordering

[dannymccormick] Different fallback for prs/issues

[noreply] Add ability to self-assign issues for non-committers (#21719)

[dannymccormick] Fix sync time

[noreply] Dont try to generate jiras as part of dependency report (#21753)

[noreply] Allow users to comment `.take-issue` without taking (#21755)


------------------------------------------
[...truncated 581.62 KB...]
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.40.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8
>>>   collect markers: -m=it_validatesrunner and not no_sickbay_streaming and not no_xdist
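
For reference, the pipeline options echoed above are ordinary Beam Python pipeline options handed through to the test suite, and the "-m" expression selects tests by pytest marker. A minimal sketch (placeholder project and bucket names, not the values used by this job) of how such flags are parsed by the Python SDK:

    # Minimal sketch, not part of this job: parsing Dataflow-style flags into
    # Beam PipelineOptions. Project and bucket names below are placeholders.
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, StandardOptions)

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=my-gcp-project',              # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/temp',   # placeholder
        '--streaming',
    ])
    print(options.view_as(StandardOptions).runner)      # TestDataflowRunner
    print(options.view_as(GoogleCloudOptions).project)  # my-gcp-project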
============================= test session starts ==============================
platform linux -- Python 3.9.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw2] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw1] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw4] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw3] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw5] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw7] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
[gw6] Python 3.9.10 (main, Jan 15 2022, 18:17:56)  -- [GCC 9.3.0]
gw0 [33] / gw1 [33] / gw2 [33] / gw3 [33] / gw4 [33] / gw5 [33] / gw6 [33] / gw7 [33]

scheduling tests via LoadScheduling

apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine 
apache_beam/pipeline_test.py::DoFnTest::test_key_param 
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally 
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values 
apache_beam/pipeline_test.py::DoFnTest::test_element_param 
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_limit 
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key 
apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo 
[gw4] PASSED apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values 
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest 
[gw4] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections 
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse 
[gw1] PASSED apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo 
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest 
[gw1] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return 
[gw3] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally 
apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle 
[gw0] PASSED apache_beam/pipeline_test.py::DoFnTest::test_element_param 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state 
[gw0] SKIPPED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice 
[gw6] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection 
[gw7] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers 
[gw2] PASSED apache_beam/pipeline_test.py::DoFnTest::test_key_param 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine 
[gw5] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_limit 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection 
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs 
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield 
[gw3] PASSED apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs 
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs 
[gw2] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels 
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice 
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input 
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input 
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input 
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs 
apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps 
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels 
[gw3] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input 
[gw6] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice 
[gw5] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input 
[gw4] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input 
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input 
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input 
[gw0] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps 
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py39-xdist.xml> -
============= 32 passed, 2 skipped, 8 warnings in 2347.59 seconds ==============
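
The repeated tenacity warning in the summary above is Python's notice that generator-based coroutines are deprecated; an illustrative sketch of the suggested migration (a generic function, not tenacity's code):

    # Illustrative only: the style change the DeprecationWarning asks for.
    # Old style (deprecated since Python 3.8, removed in 3.11):
    #     @asyncio.coroutine
    #     def wait_briefly():
    #         yield from asyncio.sleep(0)
    # New style, using a native coroutine:
    import asyncio

    async def wait_briefly():
        await asyncio.sleep(0)
        return 'done'

    print(asyncio.run(wait_briefly()))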
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.40.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO
>>>   collect markers: -m=it_validatesrunner and not no_sickbay_streaming and no_xdist
============================= test session starts ==============================
platform linux -- Python 3.9.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False

----------------------------- live log collection ------------------------------
WARNING  root:avroio_test.py:51 python-snappy is not installed; some tests will be skipped.
WARNING  root:tfrecordio_test.py:55 Tensorflow is not installed, so skipping some tests.
WARNING  apache_beam.runners.interactive.interactive_environment:interactive_environment.py:190 Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING  apache_beam.runners.interactive.interactive_environment:interactive_environment.py:199 You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.9_sdk:2.40.0.dev
collected 5621 items / 5620 deselected / 1 skipped

apache_beam/runners/dataflow/dataflow_exercise_streaming_metrics_pipeline_test.py::ExerciseStreamingMetricsPipelineTest::test_streaming_pipeline_returns_expected_user_metrics_fnapi_it 
-------------------------------- live log call ---------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:754 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp7xw0ivrm/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
E0608 22:02:01.571235428 2073118 fork_posix.cc:76]           Other threads are currently calling into gRPC, skipping fork() handlers
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:476 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
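
The message above points at the SDK container pre-building workflow; a hedged sketch of the relevant flags as described in the linked guide (placeholder project, bucket, and registry; verify the exact option names against your Beam release):

    # Hedged sketch: pre-build the SDK container once so workers skip the
    # repeated pip installs. Flag names follow the linked Dataflow guide;
    # confirm them for your Beam version. All values below are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-gcp-project',
        '--region=us-central1',
        '--temp_location=gs://my-bucket/temp',
        '--requirements_file=requirements.txt',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-gcp-project/prebuilt',
    ])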
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.9_sdk:2.40.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python39-fnapi:beam-master-20220512
INFO     root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python39-fnapi:beam-master-20220512" for Docker environment
INFO     apache_beam.internal.gcp.auth:auth.py:136 Setting socket default timeout to 60 seconds.
INFO     apache_beam.internal.gcp.auth:auth.py:138 socket default timeout is 60.0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/pickled_main_session...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/pickled_main_session in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/seaborn-0.11.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/dataflow-worker.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/dataflow-worker.jar in 6 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0608220202-388944-nthvafm2.1654725722.389116/pipeline.pb in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:335 Discarding unparseable args: ['--sleep_secs=20']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:335 Discarding unparseable args: ['--sleep_secs=20']
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:897 Create job: <Job
                                                                           clientRequestId: '20220608220202390189-2535'
                                                                           createTime: '2022-06-08T22:02:10.878240Z'
                                                                           currentStateTime: '1970-01-01T00:00:00Z'
                                                                           id: '2022-06-08_15_02_10-13342899750533358742'
                                                                           location: 'us-central1'
                                                                           name: 'beamapp-jenkins-0608220202-388944-nthvafm2'
                                                                           projectId: 'apache-beam-testing'
                                                                           stageStates: []
                                                                           startTime: '2022-06-08T22:02:10.878240Z'
                                                                           steps: []
                                                                           tempFiles: []
                                                                           type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:899 Created job with id: [2022-06-08_15_02_10-13342899750533358742]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:900 Submitted job: 2022-06-08_15_02_10-13342899750533358742
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:901 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-06-08_15_02_10-13342899750533358742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-06-08_15_02_10-13342899750533358742?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: 
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-06-08_15_02_10-13342899750533358742?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-06-08_15_02_10-13342899750533358742 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:14.774Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:14.924Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-06-08_15_02_10-13342899750533358742. The number of workers will be between 1 and 100.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:14.986Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-06-08_15_02_10-13342899750533358742.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:19.339Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:20.012Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:20.832Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:20.912Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:20.940Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:20.988Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.022Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.044Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.080Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.106Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.138Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/ToProtobuf into generate_metrics
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.170Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write into dump_to_pub/ToProtobuf
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.206Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.237Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.266Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.298Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.331Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.384Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.420Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:21.485Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:35.234Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
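
This job exercises user-defined metrics (the dataflow.googleapis.com/job/user_counter metric mentioned above); a minimal sketch, with hypothetical namespace and counter names, of declaring a user counter in a DoFn and querying it from the pipeline result:

    # Minimal sketch, hypothetical names: declare and query a user counter.
    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.metrics.metric import MetricsFilter

    class CountElements(beam.DoFn):
        def __init__(self):
            self.counter = Metrics.counter('my_namespace', 'elements_seen')

        def process(self, element):
            self.counter.inc()
            yield element

    with beam.Pipeline() as pipeline:  # DirectRunner by default
        _ = pipeline | beam.Create([1, 2, 3]) | beam.ParDo(CountElements())

    query = pipeline.result.metrics().query(
        MetricsFilter().with_name('elements_seen'))
    for counter in query['counters']:
        print(counter.key.metric.name, counter.committed)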
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:02:53.298Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-06-08T22:03:21.720Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:253 Timing out on waiting for job 2022-06-08_15_02_10-13342899750533358742 after 61 seconds
PASSED
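
The "Timing out on waiting" line above is the bounded wait the test runner applies before it checks results on a streaming job; a minimal sketch of the same pattern against a generic PipelineResult (helper name is ours, not the harness code; the duration argument is in milliseconds):

    # Minimal sketch, not the test harness: wait a bounded time for a streaming
    # job, then cancel it if it has not reached a terminal state.
    from apache_beam.runners.runner import PipelineState

    def wait_then_cancel(result, wait_ms=60 * 1000):
        """result: a PipelineResult returned by pipeline.run()."""
        result.wait_until_finish(duration=wait_ms)
        if not PipelineState.is_terminal(result.state):
            result.cancel()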

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py39-noxdist.xml> -
====== 1 passed, 1 skipped, 5620 deselected, 1 warnings in 504.46 seconds ======

> Task :sdks:python:test-suites:dataflow:validatesRunnerStreamingTestsV2

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 232

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 35s
93 actionable tasks: 58 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cbha7j43nmug4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow_V2 #3533

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/3533/display/redirect?page=changes>

