Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/06/01 07:14:22 UTC
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #2075
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2075/display/redirect>
Changes:
------------------------------------------
[...truncated 81.80 KB...]
======== 1 failed, 33 passed, 1 skipped, 16 warnings in 2020.67 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests FAILED
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.31.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8
>>> collect markers: -m=it_validatesrunner and not no_sickbay_streaming and not no_xdist
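The `-m` expression above is a boolean filter over each test's marker names. As a rough illustration (pytest uses its own expression parser; this `eval`-based stand-in only handles bare names with `and`/`or`/`not`), the selection logic works like this:

```python
# Simplified stand-in for pytest's -m marker-expression evaluation,
# using the exact expression from the log line above.
def selected(markers, expression=("it_validatesrunner and not no_sickbay_streaming"
                                  " and not no_xdist")):
    """Return True if a test carrying `markers` matches the -m expression."""
    known = {"it_validatesrunner", "no_sickbay_streaming", "no_xdist"}
    env = {name: (name in markers) for name in known}
    return eval(expression, {"__builtins__": {}}, env)

# A test marked only it_validatesrunner is collected; adding no_xdist or
# no_sickbay_streaming excludes it from this xdist streaming suite.
print(selected({"it_validatesrunner"}))               # True
print(selected({"it_validatesrunner", "no_xdist"}))   # False
```

So the suite runs only `it_validatesrunner` tests that are not sickbayed for streaming and are safe to run under pytest-xdist.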
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw2] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw0] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw3] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw4] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw1] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw7] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw5] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw6] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
gw0 [32] / gw1 [32] / gw2 [32] / gw3 [32] / gw4 [32] / gw5 [32] / gw6 [32] / gw7 [32]
scheduling tests via LoadScheduling
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally
apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo
apache_beam/pipeline_test.py::DoFnTest::test_element_param
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key
apache_beam/pipeline_test.py::DoFnTest::test_key_param
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values
[gw6] SKIPPED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections
[gw4] PASSED apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values
apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections
apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse
[gw5] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection
[gw4] PASSED apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle
apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs
[gw3] PASSED apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest
[gw0] PASSED apache_beam/pipeline_test.py::DoFnTest::test_key_param
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest
[gw3] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice
[gw0] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs
[gw1] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection
[gw2] PASSED apache_beam/pipeline_test.py::DoFnTest::test_element_param
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine
[gw2] FAILED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input
[gw7] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs
apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs
apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice
[gw4] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
[gw6] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input
=================================== FAILURES ===================================
_______________ CombineFnLifecycleTest.test_non_liftable_combine _______________
[gw2] linux -- Python 3.8.5 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8>
self = <apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest testMethod=test_non_liftable_combine>
    @skip_unless_v2
    def test_non_liftable_combine(self):
>     run_combine(self.pipeline, lift_combiners=False)
apache_beam/transforms/combinefn_lifecycle_test.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/transforms/combinefn_lifecycle_pipeline.py:118: in run_combine
    pcoll |= 'Do' >> beam.CombineGlobally(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:1908: in expand
    pcoll
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2275: in expand
    hot
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2052: in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/dataflow/dataflow_runner.py:820: in apply_GroupByKey
    return transform.expand(pcoll)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <GroupByKey(PTransform) label=[GroupByKey] at 0x7fb131905cd0>
pcoll = <PCollection[Do/CombinePerKey/WindowIntoDiscarding.None] at 0x7fb131ff2160>
    def expand(self, pcoll):
      from apache_beam.transforms.trigger import DataLossReason
      from apache_beam.transforms.trigger import DefaultTrigger
      windowing = pcoll.windowing
      trigger = windowing.triggerfn
      if not pcoll.is_bounded and isinstance(
          windowing.windowfn, GlobalWindows) and isinstance(trigger,
                                                            DefaultTrigger):
        raise ValueError(
            'GroupByKey cannot be applied to an unbounded ' +
            'PCollection with global windowing and a default trigger')

      if not pcoll.pipeline.allow_unsafe_triggers:
        unsafe_reason = trigger.may_lose_data(windowing)
        if unsafe_reason != DataLossReason.NO_POTENTIAL_LOSS:
          msg = 'Unsafe trigger: `{}` may lose data. '.format(trigger)
          msg += 'Reason: {}. '.format(
              str(unsafe_reason).replace('DataLossReason.', ''))
          msg += 'This can be overriden with the --allow_unsafe_triggers flag.'
>         raise ValueError(msg)
E       ValueError: Unsafe trigger: `AfterCount(5)` may lose data. Reason: CONDITION_NOT_GUARANTEED|MAY_FINISH. This can be overriden with the --allow_unsafe_triggers flag.
apache_beam/transforms/core.py:2335: ValueError
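The guard that raised this error can be reproduced as a self-contained sketch. Here `DataLossReason` is a simplified stand-in for `apache_beam.transforms.trigger.DataLossReason`, and the trigger's `may_lose_data` result is passed in as a precomputed value rather than derived from a real windowing strategy:

```python
from enum import Flag, auto

class DataLossReason(Flag):
    # Simplified stand-in for apache_beam.transforms.trigger.DataLossReason.
    NO_POTENTIAL_LOSS = 0
    MAY_FINISH = auto()
    CONDITION_NOT_GUARANTEED = auto()

def check_trigger_safety(trigger_repr, unsafe_reason, allow_unsafe_triggers=False):
    """Mirrors the guard in GroupByKey.expand from the traceback above."""
    if allow_unsafe_triggers:
        return  # the --allow_unsafe_triggers pipeline flag skips the check
    if unsafe_reason != DataLossReason.NO_POTENTIAL_LOSS:
        msg = 'Unsafe trigger: `{}` may lose data. '.format(trigger_repr)
        msg += 'Reason: {}. '.format(
            str(unsafe_reason).replace('DataLossReason.', ''))
        msg += 'This can be overriden with the --allow_unsafe_triggers flag.'
        raise ValueError(msg)

# AfterCount(5) may finish and does not guarantee its firing condition,
# matching the CONDITION_NOT_GUARANTEED|MAY_FINISH reason in the log.
reason = DataLossReason.CONDITION_NOT_GUARANTEED | DataLossReason.MAY_FINISH
check_trigger_safety('AfterCount(5)', reason, allow_unsafe_triggers=True)  # no error
```

In the real pipeline, passing `--allow_unsafe_triggers` sets `pcoll.pipeline.allow_unsafe_triggers` and skips this validation, which is the override the error message suggests.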
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 1 failed, 30 passed, 1 skipped, 8 warnings in 2207.70 seconds =========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 156
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 189
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 13m 48s
81 actionable tasks: 52 executed, 29 from cache
Publishing build scan...
https://gradle.com/s/iggcn33nvnvhq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow_V2 #2078
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2078/display/redirect?page=changes>
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #2077
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2077/display/redirect?page=changes>
Changes:
[zyichi] Minor fix to prebuilding sdk workflow timeout setting
[Ismaël Mejía] [BEAM-12423] Upgrade pyarrow to support version 4.0.0 too
------------------------------------------
[...truncated 81.98 KB...]
======== 1 failed, 33 passed, 1 skipped, 10 warnings in 2111.74 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests FAILED
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.31.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8
>>> collect markers: -m=it_validatesrunner and not no_sickbay_streaming and not no_xdist
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw1] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw2] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw6] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw3] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw4] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw7] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw5] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
gw0 [32] / gw1 [32] / gw2 [32] / gw3 [32] / gw4 [32] / gw5 [32] / gw6 [32] / gw7 [32]
scheduling tests via LoadScheduling
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values
apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo
apache_beam/pipeline_test.py::DoFnTest::test_element_param
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine
apache_beam/pipeline_test.py::DoFnTest::test_key_param
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state
[gw5] SKIPPED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections
[gw6] PASSED apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values
apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle
[gw2] PASSED apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest
[gw2] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return
[gw6] PASSED apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle
apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs
[gw0] PASSED apache_beam/pipeline_test.py::DoFnTest::test_element_param
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine
[gw0] FAILED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice
[gw1] PASSED apache_beam/pipeline_test.py::DoFnTest::test_key_param
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest
[gw1] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice
[gw7] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection
[gw4] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers
[gw3] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections
apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse
[gw2] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels
[gw3] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs
apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input
[gw4] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input
[gw2] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults
apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels
[gw5] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input
[gw1] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input
=================================== FAILURES ===================================
_______________ CombineFnLifecycleTest.test_non_liftable_combine _______________
[gw0] linux -- Python 3.8.5 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8>
self = <apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest testMethod=test_non_liftable_combine>
[1m @skip_unless_v2[0m
[1m def test_non_liftable_combine(self):[0m
[1m> run_combine(self.pipeline, lift_combiners=False)[0m
[1m[31mapache_beam/transforms/combinefn_lifecycle_test.py[0m:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[1m[31mapache_beam/transforms/combinefn_lifecycle_pipeline.py[0m:118: in run_combine
[1m pcoll |= 'Do' >> beam.CombineGlobally([0m
[1m[31mapache_beam/pvalue.py[0m:136: in __or__
[1m return self.pipeline.apply(ptransform, self)[0m
[1m[31mapache_beam/pipeline.py[0m:640: in apply
[1m return self.apply([0m
[1m[31mapache_beam/pipeline.py[0m:651: in apply
[1m return self.apply(transform, pvalueish)[0m
[1m[31mapache_beam/pipeline.py[0m:694: in apply
[1m pvalueish_result = self.runner.apply(transform, pvalueish, self._options)[0m
[1m[31mapache_beam/runners/dataflow/dataflow_runner.py[0m:141: in apply
[1m return super(DataflowRunner, self).apply(transform, input, options)[0m
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:1908: in expand
    pcoll
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2275: in expand
    hot
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2052: in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/dataflow/dataflow_runner.py:820: in apply_GroupByKey
    return transform.expand(pcoll)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <GroupByKey(PTransform) label=[GroupByKey] at 0x7f3677c8b3a0>
pcoll = <PCollection[Do/CombinePerKey/WindowIntoDiscarding.None] at 0x7f3677c3c910>
    def expand(self, pcoll):
      from apache_beam.transforms.trigger import DataLossReason
      from apache_beam.transforms.trigger import DefaultTrigger
      windowing = pcoll.windowing
      trigger = windowing.triggerfn
      if not pcoll.is_bounded and isinstance(
          windowing.windowfn, GlobalWindows) and isinstance(trigger,
                                                            DefaultTrigger):
        raise ValueError(
            'GroupByKey cannot be applied to an unbounded ' +
            'PCollection with global windowing and a default trigger')

      if not pcoll.pipeline.allow_unsafe_triggers:
        unsafe_reason = trigger.may_lose_data(windowing)
        if unsafe_reason != DataLossReason.NO_POTENTIAL_LOSS:
          msg = 'Unsafe trigger: `{}` may lose data. '.format(trigger)
          msg += 'Reason: {}. '.format(
              str(unsafe_reason).replace('DataLossReason.', ''))
          msg += 'This can be overriden with the --allow_unsafe_triggers flag.'
>         raise ValueError(msg)
E       ValueError: Unsafe trigger: `AfterCount(5)` may lose data. Reason: CONDITION_NOT_GUARANTEED|MAY_FINISH. This can be overriden with the --allow_unsafe_triggers flag.
apache_beam/transforms/core.py:2335: ValueError
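The `GroupByKey.expand` check quoted in the traceback above rejects any trigger whose data-loss reason set is non-empty, and the reason in the error, `CONDITION_NOT_GUARANTEED|MAY_FINISH`, is a union of two flags. As a minimal, hypothetical sketch (names modeled on the log output, not Beam's actual implementation), the flag-style reason type and the guard can be emulated like this:

```python
from enum import Flag, auto

class DataLossReason(Flag):
    """Hypothetical stand-in for apache_beam.transforms.trigger.DataLossReason."""
    NO_POTENTIAL_LOSS = 0
    MAY_FINISH = auto()                 # trigger may finish and drop later data
    CONDITION_NOT_GUARANTEED = auto()   # firing condition may never be satisfied

def check_trigger(reason, trigger_repr='AfterCount(5)', allow_unsafe_triggers=False):
    """Raise, as the expand() above does, when a trigger may lose data."""
    if allow_unsafe_triggers or reason == DataLossReason.NO_POTENTIAL_LOSS:
        return
    # Mirror the log's formatting: strip the enum class prefix from the flags.
    name = str(reason).replace('DataLossReason.', '')
    raise ValueError(
        'Unsafe trigger: `{}` may lose data. Reason: {}. '
        'This can be overriden with the --allow_unsafe_triggers flag.'.format(
            trigger_repr, name))
```

Passing `--allow_unsafe_triggers` to the pipeline (modeled here by `allow_unsafe_triggers=True`) is the bypass the error message itself suggests.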
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 1 failed, 30 passed, 1 skipped, 8 warnings in 2361.02 seconds =========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 156
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 189
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 19m 57s
81 actionable tasks: 54 executed, 27 from cache
Publishing build scan...
https://gradle.com/s/uam4yvxt76bjc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #2076
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2076/display/redirect>
Changes:
------------------------------------------
[...truncated 81.07 KB...]
======== 1 failed, 33 passed, 1 skipped, 16 warnings in 2043.40 seconds ========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests FAILED
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.31.0-SNAPSHOT.jar> --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --enable_streaming_engine
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8
>>> collect markers: -m=it_validatesrunner and not no_sickbay_streaming and not no_xdist
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
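The `timeout method: signal` setting above means the per-test 4500 s deadline is enforced by arming a SIGALRM timer around each test. A minimal sketch of that mechanism (illustrative only; `run_with_timeout` and `TestTimeout` are made-up names, not pytest-timeout's API, and SIGALRM is POSIX-only):

```python
import signal

class TestTimeout(Exception):
    """Raised when the wrapped callable exceeds its deadline."""
    pass

def run_with_timeout(fn, seconds):
    """Run fn(), raising TestTimeout if it runs longer than `seconds`."""
    def _handler(signum, frame):
        raise TestTimeout('test exceeded {}s'.format(seconds))
    old = signal.signal(signal.SIGALRM, _handler)
    signal.alarm(seconds)            # schedule the interrupt
    try:
        return fn()
    finally:
        signal.alarm(0)              # cancel any pending alarm
        signal.signal(signal.SIGALRM, old)
```

The signal method can interrupt a test stuck in most blocking calls, which is why long-running integration suites like this one tend to prefer it over thread-based timeouts.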
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw0] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw3] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw2] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw5] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw6] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw4] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
[gw7] Python 3.8.5 (default, Jul 20 2020, 19:50:14) -- [GCC 5.4.0 20160609]
gw0 [32] / gw1 [32] / gw2 [32] / gw3 [32] / gw4 [32] / gw5 [32] / gw6 [32] / gw7 [32]
scheduling tests via LoadScheduling
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine
apache_beam/pipeline_test.py::DoFnTest::test_element_param
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally
apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key
apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo
apache_beam/pipeline_test.py::DoFnTest::test_key_param
[gw7] SKIPPED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combining_value_state
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections
[gw2] PASSED apache_beam/runners/portability/fn_api_runner/fn_runner_test.py::FnApiBasedStateBackedCoderTest::test_gbk_many_values
apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle
[gw1] PASSED apache_beam/pipeline_test.py::DoFnTest::test_element_param
apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine
[gw1] FAILED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_non_liftable_combine
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return
[gw4] PASSED apache_beam/transforms/combinefn_lifecycle_test.py::CombineFnLifecycleTest::test_combine
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_pcollections
apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse
[gw3] PASSED apache_beam/metrics/metric_test.py::MetricsTest::test_user_counter_using_pardo
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest
[gw2] PASSED apache_beam/transforms/dofn_lifecycle_test.py::DoFnLifecycleTest::test_dofn_lifecycle
apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs
[gw3] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_latest
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs
[gw6] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_per_key
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers
[gw0] PASSED apache_beam/pipeline_test.py::DoFnTest::test_key_param
apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest
[gw0] PASSED apache_beam/transforms/combiners_test.py::TimestampCombinerTest::test_combiner_earliest
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input
[gw5] PASSED apache_beam/runners/portability/fn_api_runner/translations_test.py::TranslationsTest::test_run_packable_combine_globally
apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_return
apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_default_value_singleton_side_input
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_one_single_pcollection
apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs
[gw6] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_multiple_pcollections_having_multiple_consumers
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels
[gw7] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_impulse
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice
[gw2] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults
[gw5] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input
[gw1] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_empty_singleton_side_input
apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input
[gw4] PASSED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs
apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
[gw7] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice
[gw5] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_flattened_side_input
[gw6] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels
[gw3] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults
[gw2] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_twice
[gw1] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_iterable_side_input
[gw0] PASSED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_reiterable_side_input
[gw4] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
=================================== FAILURES ===================================
_______________ CombineFnLifecycleTest.test_non_liftable_combine _______________
[gw1] linux -- Python 3.8.5 <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8>
self = <apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest testMethod=test_non_liftable_combine>
    @skip_unless_v2
    def test_non_liftable_combine(self):
>       run_combine(self.pipeline, lift_combiners=False)
apache_beam/transforms/combinefn_lifecycle_test.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/transforms/combinefn_lifecycle_pipeline.py:118: in run_combine
    pcoll |= 'Do' >> beam.CombineGlobally(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:1908: in expand
    pcoll
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:640: in apply
    return self.apply(
apache_beam/pipeline.py:651: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2275: in expand
    hot
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/transforms/core.py:2052: in expand
    return pcoll | GroupByKey() | 'Combine' >> CombineValues(
apache_beam/pvalue.py:136: in __or__
    return self.pipeline.apply(ptransform, self)
apache_beam/pipeline.py:694: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super(DataflowRunner, self).apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/dataflow/dataflow_runner.py:820: in apply_GroupByKey
    return transform.expand(pcoll)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <GroupByKey(PTransform) label=[GroupByKey] at 0x7f633a6f1af0>
pcoll = <PCollection[Do/CombinePerKey/WindowIntoDiscarding.None] at 0x7f633a9c7160>
    def expand(self, pcoll):
      from apache_beam.transforms.trigger import DataLossReason
      from apache_beam.transforms.trigger import DefaultTrigger
      windowing = pcoll.windowing
      trigger = windowing.triggerfn
      if not pcoll.is_bounded and isinstance(
          windowing.windowfn, GlobalWindows) and isinstance(trigger,
                                                            DefaultTrigger):
        raise ValueError(
            'GroupByKey cannot be applied to an unbounded ' +
            'PCollection with global windowing and a default trigger')

      if not pcoll.pipeline.allow_unsafe_triggers:
        unsafe_reason = trigger.may_lose_data(windowing)
        if unsafe_reason != DataLossReason.NO_POTENTIAL_LOSS:
          msg = 'Unsafe trigger: `{}` may lose data. '.format(trigger)
          msg += 'Reason: {}. '.format(
              str(unsafe_reason).replace('DataLossReason.', ''))
          msg += 'This can be overriden with the --allow_unsafe_triggers flag.'
>         raise ValueError(msg)
E       ValueError: Unsafe trigger: `AfterCount(5)` may lose data. Reason: CONDITION_NOT_GUARANTEED|MAY_FINISH. This can be overriden with the --allow_unsafe_triggers flag.
apache_beam/transforms/core.py:2335: ValueError
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 1 failed, 30 passed, 1 skipped, 8 warnings in 2239.29 seconds =========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 156
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 189
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 14m 38s
81 actionable tasks: 52 executed, 29 from cache
Publishing build scan...
https://gradle.com/s/u4gp44tofgsk2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure