Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/06/01 13:15:11 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #2076

See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2076/display/redirect>

Changes:


------------------------------------------
[...truncated 81.07 KB...]
======== 1 failed, 33 passed, 1 skipped, 16 warnings in 2043.40 seconds ========

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests FAILED

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.31.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8
>>>   collect markers: -m=it_validatesrunner and not no_sickbay_streaming and not no_xdist
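
The "collect markers" line above is the pytest -m expression that decides which ValidatesRunner tests this streaming suite selects. A minimal sketch (hypothetical test names; the marker names are taken from the expression above and are presumably registered in the pytest.ini shown below) of how a test opts in or is sickbayed out:

    import pytest

    @pytest.mark.it_validatesrunner
    def test_included_in_this_suite():
        ...  # collected: matches it_validatesrunner and has no exclusion markers

    @pytest.mark.it_validatesrunner
    @pytest.mark.no_sickbay_streaming
    def test_excluded_from_streaming_runs():
        ...  # deselected here by "and not no_sickbay_streaming"
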
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
gw0 [32] / gw1 [32] / gw2 [32] / gw3 [32] / gw4 [32] / gw5 [32] / gw6 [32] / gw7 [32]

scheduling tests via LoadScheduling

apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine 
apache_beam/pipeline_test.py::DoFnTest::test_element_param 
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values 
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally 
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key 
apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo 
apache_beam/pipeline_test.py::DoFnTest::test_key_param 
[gw7] SKIPPED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections 
[gw2] PASSED apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values 
apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle 
[gw1] PASSED apache_beam/pipeline_test.py::DoFnTest::test_element_param 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine 
[gw1] FAILED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return 
[gw4] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection 
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse 
[gw3] PASSED apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo 
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest 
[gw2] PASSED apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs 
[gw3] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs 
[gw6] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers 
[gw0] PASSED apache_beam/pipeline_test.py::DoFnTest::test_key_param 
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest 
[gw0] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input 
[gw5] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection 
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input 
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs 
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels 
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice 
[gw2] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice 
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults 
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input 
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input 
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs 
apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps 
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice 
[gw5] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input 
[gw6] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels 
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults 
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice 
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input 
[gw4] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps 

=================================== FAILURES ===================================
_______________ CombineFnLifecycleTest.test_non_liftable_combine _______________
[gw1] linux -- Python 3.8.5 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8>

self = <apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest testMethod=test_non_liftable_combine>

    @skip_unless_v2
    def test_non_liftable_combine(self):
>     run_combine(self.pipeline, lift_combiners=False)

apache_beam/transforms/combinefn_lifecycle_test.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/transforms/combinefn_lifecycle_pipeline.py:118: in run_combine
    pcoll |= 'Do' >> beam.CombineGlobally(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:1908: in expand
    pcoll
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2275: in expand
    hot
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2052: in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/dataflow/dataflow_runner.py:820: in apply_GroupByKey
    return transform.expand(pcoll)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <GroupByKey(PTransform) label=[GroupByKey] at 0x7f633a6f1af0>
pcoll = <PCollection[Do/CombinePerKey/WindowIntoDiscarding.None] at 0x7f633a9c7160>

    def expand(self, pcoll):
      from apache_beam.transforms.trigger import DataLossReason
      from apache_beam.transforms.trigger import DefaultTrigger
      windowing = pcoll.windowing
      trigger = windowing.triggerfn
      if not pcoll.is_bounded and isinstance(
          windowing.windowfn, GlobalWindows) and isinstance(trigger,
                                                            DefaultTrigger):
        raise ValueError(
            'GroupByKey cannot be applied to an unbounded ' +
            'PCollection with global windowing and a default trigger')
    
      if not pcoll.pipeline.allow_unsafe_triggers:
        unsafe_reason = trigger.may_lose_data(windowing)
        if unsafe_reason != DataLossReason.NO_POTENTIAL_LOSS:
          msg = 'Unsafe trigger: `{}` may lose data. '.format(trigger)
          msg += 'Reason: {}. '.format(
              str(unsafe_reason).replace('DataLossReason.', ''))
          msg += 'This can be overriden with the --allow_unsafe_triggers flag.'
>         raise ValueError(msg)
E         ValueError: Unsafe trigger: `AfterCount(5)` may lose data. Reason: CONDITION_NOT_GUARANTEED|MAY_FINISH. This can be overriden with the --allow_unsafe_triggers flag.

apache_beam/transforms/core.py:2335: ValueError
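
The failure above is the unsafe-trigger check shown in GroupByKey.expand: AfterCount(5) is a finishing trigger, so it may drop data that arrives after it fires, and GroupByKey now rejects it unless the pipeline opts in. A minimal sketch, not taken from this build, assuming the default DirectRunner and Beam's public trigger API, of the two ways around the check: wrap the trigger in Repeatedly so it never finishes, or pass the --allow_unsafe_triggers flag named in the error message.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import trigger, window

    # Alternative 1: opt in to the unsafe trigger with the flag from the
    # error message (uncomment to keep a bare AfterCount(5) trigger).
    # options = PipelineOptions(['--allow_unsafe_triggers'])
    options = PipelineOptions()

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([('k', i) for i in range(10)])
         # Alternative 2: Repeatedly(AfterCount(5)) re-arms after each
         # firing, so may_lose_data() reports no potential loss and
         # GroupByKey accepts it without any extra flag.
         | beam.WindowInto(
             window.GlobalWindows(),
             trigger=trigger.Repeatedly(trigger.AfterCount(5)),
             accumulation_mode=trigger.AccumulationMode.DISCARDING)
         | beam.GroupByKey()
         | beam.Map(print))
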
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 1 failed, 30 passed, 1 skipped, 8 warnings in 2239.29 seconds =========

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 156

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 189

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 38s
81 actionable tasks: 52 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/u4gp44tofgsk2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow_V2 #2078

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2078/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #2077

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2077/display/redirect?page=changes>

Changes:

[zyichi] Minor fix to prebuilding sdk workflow timeout setting

[Ismaël Mejía] [BEAM-12423] Upgrade pyarrow to support version 4.0.0 too


------------------------------------------
[...truncated 81.98 KB...]
======== 1 failed, 33 passed, 1 skipped, 10 warnings in 2111.74 seconds ========

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests FAILED

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.31.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8
>>>   collect markers: -m=it_validatesrunner and not no_sickbay_streaming and not no_xdist
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.8.5 (default, Jul 20 2020, 19:50:14)  -- [GCC 5.4.0 20160609]
gw0 [32] / gw1 [32] / gw2 [32] / gw3 [32] / gw4 [32] / gw5 [32] / gw6 [32] / gw7 [32]

scheduling tests via LoadScheduling

apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key 
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values 
apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo 
apache_beam/pipeline_test.py::DoFnTest::test_element_param 
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine 
apache_beam/pipeline_test.py::DoFnTest::test_key_param 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state 
[gw5] SKIPPED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections 
[gw6] PASSED apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values 
apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle 
[gw2] PASSED apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo 
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest 
[gw2] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return 
[gw6] PASSED apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs 
[gw0] PASSED apache_beam/pipeline_test.py::DoFnTest::test_element_param 
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine 
[gw0] FAILED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice 
[gw1] PASSED apache_beam/pipeline_test.py::DoFnTest::test_key_param 
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest 
[gw1] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice 
[gw7] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection 
[gw4] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers 
[gw3] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection 
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse 
[gw2] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs 
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults 
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input 
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels 
[gw3] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input 
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs 
apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs 
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input 
[gw4] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input 
[gw2] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input 
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults 
apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs 
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input 
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input 
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs 
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels 
[gw5] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input 
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input 
[gw1] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps 
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input 

=================================== FAILURES ===================================
_______________ CombineFnLifecycleTest.test_non_liftable_combine _______________
[gw0] linux -- Python 3.8.5 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8>

self = <apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest testMethod=test_non_liftable_combine>

    @skip_unless_v2
    def test_non_liftable_combine(self):
>     run_combine(self.pipeline, lift_combiners=False)

apache_beam/transforms/combinefn_lifecycle_test.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/transforms/combinefn_lifecycle_pipeline.py:118: in run_combine
    pcoll |= 'Do' >> beam.CombineGlobally(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:1908: in expand
    pcoll
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2275: in expand
    hot
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2052: in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/dataflow/dataflow_runner.py:820: in apply_GroupByKey
    return transform.expand(pcoll)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <GroupByKey(PTransform) label=[GroupByKey] at 0x7f3677c8b3a0>
pcoll = <PCollection[Do/CombinePerKey/WindowIntoDiscarding.None] at 0x7f3677c3c910>

    def expand(self, pcoll):
      from apache_beam.transforms.trigger import DataLossReason
      from apache_beam.transforms.trigger import DefaultTrigger
      windowing = pcoll.windowing
      trigger = windowing.triggerfn
      if not pcoll.is_bounded and isinstance(
          windowing.windowfn, GlobalWindows) and isinstance(trigger,
                                                            DefaultTrigger):
        raise ValueError(
            'GroupByKey cannot be applied to an unbounded ' +
            'PCollection with global windowing and a default trigger')
    
      if not pcoll.pipeline.allow_unsafe_triggers:
        unsafe_reason = trigger.may_lose_data(windowing)
        if unsafe_reason != DataLossReason.NO_POTENTIAL_LOSS:
          msg = 'Unsafe trigger: `{}` may lose data. '.format(trigger)
          msg += 'Reason: {}. '.format(
              str(unsafe_reason).replace('DataLossReason.', ''))
          msg += 'This can be overriden with the --allow_unsafe_triggers flag.'
>         raise ValueError(msg)
E         ValueError: Unsafe trigger: `AfterCount(5)` may lose data. Reason: CONDITION_NOT_GUARANTEED|MAY_FINISH. This can be overriden with the --allow_unsafe_triggers flag.

apache_beam/transforms/core.py:2335: ValueError
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 1 failed, 30 passed, 1 skipped, 8 warnings in 2361.02 seconds =========

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 156

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 189

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 57s
81 actionable tasks: 54 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/uam4yvxt76bjc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
