Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/12 21:27:01 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #3776

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3776/display/redirect?page=changes>

Changes:

[je.ik] [BEAM-7529] Add Sums.ofFloats() and Sums.ofDoubles()

------------------------------------------
[...truncated 167.70 KB...]
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_23-5528666824736420115?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_37_05-13019739813829589992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_25-2089532931657751083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_38_05-11145615976665915765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_26-5704603245485908667?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_37_42-6902445666340412261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_30-8774188714010933488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_38_15-15191122730361246857?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_24-11111850917672698395?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_36_33-2483930478387744869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_44_22-9201025236002149227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_26-7096135809707440271?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_38_27-1547813356780484065?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_25-17656137538718778251?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_39_37-2407674716798896593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_29_25-13721000759588404928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_39_46-11484388583897733237?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1873.277s

OK

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "

> Task :sdks:python:validatesRunnerStreamingTests
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 1643.205s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_07-9501278361587011433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_08-7294593365137604489?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_01_19-7929103688690358219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_08-16594593142817982888?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_03_34-10111721611050982432?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_09-13023352887525065714?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_02_28-3788047148370246705?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_09-4538367692621135163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_08-15842098818146562770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_01_59-17001010894317868063?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_08-18000457129240277184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_44-6575464008130077327?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_52_08-17026862221172640076?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_49-14087103716220116054?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_08_14-3812350381493917711?project=apache-beam-testing.

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_28-11609557260225293004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_06_09-17102279527278047829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_29-3160248962177097786?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_07_09-8084110822992172591?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_28-2409220523375207317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_07_49-16627137617519111056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_28-886562169851473453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_09_42-1400071909626923159?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_29-9106710583248523156?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_10_45-6572821798921073251?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_29-15571036098041858285?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_06_07-7321880680065787690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_29-11975496124104777726?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_13_56_29-15157075100801770232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_08_17-15559165607666274499?project=apache-beam-testing.
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 1542.384s

OK

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_33-270190583323400955?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_10_49-3899450082144811982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_33-10619606410707383954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_10_50-5072924922668978530?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_35-11973279467791850054?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_10_46-15832287920213212650?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_33-12514834993210267495?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_10_49-4061707150715716799?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_33-6009394778197582967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_10_25-11292136202205896304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_34-5069242872755243363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_09_12-13703737225890690954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_33-8726447223416225180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_00_33-17967628465916410419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_14_09_38-9549012406938488628?project=apache-beam-testing.
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 1596.117s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 67

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 87

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 47s
76 actionable tasks: 59 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/sldnbuvcy2i2s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #3778

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3778/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #3777

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/3777/display/redirect?page=changes>

Changes:

[hsuryawirawan] Add side input and side output java katas

[hsuryawirawan] Add side input and side output python katas

[hsuryawirawan] Modify the TestPipeline declaration to be 'final transient'

[hsuryawirawan] Add Create.of transform names in CoGroupByKey kata

[hsuryawirawan] Improve the task description of ParDo OneToMany to describe

[hsuryawirawan] Update offset for some tasks

[hsuryawirawan] Update the task descriptions: formatting, conventions, and styles

[hsuryawirawan] Update Beam version to v2.13.0

[hsuryawirawan] Add BinaryCombineFn Lambda java kata

[hsuryawirawan] Add packages for all Java katas

[hsuryawirawan] Add composite transform Java kata

[hsuryawirawan] Add composite transform Python kata

[hsuryawirawan] Add 'public' modifier to early Task classes

[hsuryawirawan] Rename Tests classes to TaskTest

[hsuryawirawan] Add branching Java kata

[hsuryawirawan] Add branching Python kata

[hsuryawirawan] Add TextIO Read Java kata

[hsuryawirawan] Add the Kata descriptions for Branching and TextIO Read

[hsuryawirawan] Add TextIO ReadFromText Python kata

[hsuryawirawan] Add Kata description from Branching Python kata

[hsuryawirawan] Add Built-in IOs Java kata

[hsuryawirawan] Add Built-in IOs Python kata

[hsuryawirawan] Add missing copyright for Python kata "Built-in IOs" test file

[hsuryawirawan] Add rat exclusion for Katas IO txt files

[hsuryawirawan] Removed unused entry in Java kata study_project.xml

------------------------------------------
[...truncated 443.32 KB...]
                  "@type": "kind:pair",
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    },
                    {
                      "@type": "kind:varint"
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "ref_AppliedPTransform_Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1_24",
        "user_name": "Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-06-12T23:41:19.883333Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-12_16_41_18-8615114045352486148'
 location: 'us-central1'
 name: 'beamapp-jenkins-0612234110-875729'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-12T23:41:19.883333Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-06-12_16_41_18-8615114045352486148]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_41_18-8615114045352486148?project=apache-beam-testing
root: WARNING: Waiting indefinitely for streaming job.
root: INFO: Job 2019-06-12_16_41_18-8615114045352486148 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-12T23:41:22.443Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-12T23:41:23.009Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-06-12T23:41:23.495Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-06-12T23:41:23.497Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-06-12T23:41:23.506Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-12T23:41:23.513Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-12T23:41:23.515Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-12T23:41:23.517Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-06-12T23:41:23.520Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-06-12T23:41:23.530Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-12T23:41:23.546Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-12T23:41:23.549Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey0 into side list/Decode Values
root: INFO: 2019-06-12T23:41:23.551Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1 into side list/Decode Values
root: INFO: 2019-06-12T23:41:23.553Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey0 into side list/Decode Values
root: INFO: 2019-06-12T23:41:23.555Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1 into side list/Decode Values
root: INFO: 2019-06-12T23:41:23.558Z: JOB_MESSAGE_DETAILED: Unzipping flatten s16 for input s14.out
root: INFO: 2019-06-12T23:41:23.561Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-06-12T23:41:23.562Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
root: INFO: 2019-06-12T23:41:23.564Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values
root: INFO: 2019-06-12T23:41:23.567Z: JOB_MESSAGE_DETAILED: Fusing consumer main input/Decode Values into main input/Impulse
root: INFO: 2019-06-12T23:41:23.570Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-06-12T23:41:23.572Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey into Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey0
root: INFO: 2019-06-12T23:41:23.575Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/StreamingPCollectionViewWriter into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values
root: INFO: 2019-06-12T23:41:23.580Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
root: INFO: 2019-06-12T23:41:23.585Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-06-12T23:41:23.589Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
root: INFO: 2019-06-12T23:41:23.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets
root: INFO: 2019-06-12T23:41:23.594Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-06-12T23:41:23.596Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-06-12T23:41:23.598Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:217>)/Map(<lambda at sideinputs_test.py:217>)
root: INFO: 2019-06-12T23:41:23.600Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/Map(<lambda at sideinputs_test.py:217>) into main input/Decode Values
root: INFO: 2019-06-12T23:41:23.603Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Decode Values
root: INFO: 2019-06-12T23:41:23.605Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Decode Values into assert_that/Create/Impulse
root: INFO: 2019-06-12T23:41:23.607Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets
root: INFO: 2019-06-12T23:41:23.609Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/Decode Values into side list/Impulse
root: INFO: 2019-06-12T23:41:23.611Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream
root: INFO: 2019-06-12T23:41:23.613Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/WriteStream into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey
root: INFO: 2019-06-12T23:41:23.615Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/ReadStream
root: INFO: 2019-06-12T23:41:23.617Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey into Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1
root: INFO: 2019-06-12T23:41:23.620Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream into Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey
root: INFO: 2019-06-12T23:41:23.633Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-12T23:41:23.690Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-12T23:41:23.740Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-12T23:41:23.896Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-06-12T23:41:23.909Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-12T23:41:23.913Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-06-12T23:41:26.052Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/Decode Values+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/WriteStream
root: INFO: 2019-06-12T23:41:26.052Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/ReadStream+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/StreamingPCollectionViewWriter
root: INFO: 2019-06-12T23:41:26.055Z: JOB_MESSAGE_BASIC: Executing operation side list/Impulse+side list/Decode Values+Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:217>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/WriteStream+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream
root: INFO: 2019-06-12T23:41:26.055Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values+Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter
root: INFO: 2019-06-12T23:41:26.055Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/ReadStream+assert_that/Group/GroupByKey/MergeBuckets+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
root: INFO: 2019-06-12T23:41:26.099Z: JOB_MESSAGE_BASIC: Executing operation main input/Impulse+main input/Decode Values+Map(<lambda at sideinputs_test.py:217>)/Map(<lambda at sideinputs_test.py:217>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/WriteStream
root: INFO: 2019-06-12T23:41:57.491Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-12T23:42:35.551Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_50-3588337518712755348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_42_25-16333994002810410551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_53-7555933571795035727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_41_14-1585495861134184116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_54-10568906323176720081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_41_09-11899798691230961144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_53-16794786679613364887?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_41_29-11492929439133144714?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_54-17353396110368428611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_42_24-16098292708937505689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_53-7950501769091083440?project=apache-beam-testing.
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 746, in list_messages
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_42_23-17340222157944696862?project=apache-beam-testing.
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-12_16_42_23-17340222157944696862/messages?alt=json&startTime=2019-06-12T23%3A43%3A58.916Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 12 Jun 2019 23:45:54 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(f3d9e4ee4281f323): Information about job 2019-06-12_16_42_23-17340222157944696862 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_53-17502620284991375546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_30_53-3835262500541363322?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-12_16_41_18-8615114045352486148?project=apache-beam-testing.
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 746, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-12_16_41_18-8615114045352486148/messages?alt=json&startTime=2019-06-12T23%3A42%3A35.551Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 12 Jun 2019 23:46:04 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '278', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(d91189521d156cd): Information about job 2019-06-12_16_41_18-8615114045352486148 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>


----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 2342.834s

FAILED (failures=2)

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 87

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 87

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 36s
72 actionable tasks: 55 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/wiw4c4muvsbm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org