Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/10/10 12:48:44 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4769

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/4769/display/redirect>

Changes:


------------------------------------------
[...truncated 194.99 KB...]
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_25_09-13603866822996857432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_33_28-8629594627601049110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_25_08-15184113322515759681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_33_51-5717666979992233559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_25_06-6680162121716613604?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_33_38-10923212659225080518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_25_09-6001128725157960105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_33_27-3449182620244951629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_25_08-10353026875912346071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_33_30-17201412330330130540?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ERROR
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

======================================================================
ERROR: test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/ptransform_test.py",> line 242, in test_par_do_with_multiple_outputs_and_using_yield
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py",> line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 484, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 530, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 560, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 490, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py",> line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.7/subprocess.py", line 395, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.7/subprocess.py", line 487, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1.
 
 Pip install failed for package: -r         
 Output from execution of subprocess: b'Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))\n  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz\nCollecting mock (from -r postcommit_requirements.txt (line 2))\n  File was already downloaded /tmp/dataflow-requirements-cache/mock-3.0.5.tar.gz\nCollecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))\n  ERROR: Could not find a version that satisfies the requirement setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1)) (from versions: none)\nERROR: No matching distribution found for setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))\n'
-------------------- >> begin captured logging << --------------------
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1010122905-596844.1570710545.597017/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1010122905-596844.1570710545.597017/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1010122905-596844.1570710545.597017/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1010122905-596844.1570710545.597017/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
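
For context on the error above: the staging step (stager._populate_requirements_cache) shells out to pip to pre-download everything listed in postcommit_requirements.txt into a local cache before uploading it to GCS, and the failed setuptools resolution is what aborted job submission. Below is a minimal sketch of that call, reconstructed from the command logged above; the cache directory and requirements file are the ones from this build rather than general defaults, and this is not Beam's exact implementation.

    # Sketch of the requirements-cache step that failed, based on the logged
    # pip command; paths and file names are taken from this build's log.
    import subprocess

    cmd = [
        "python", "-m", "pip", "download",
        "--dest", "/tmp/dataflow-requirements-cache",
        "-r", "postcommit_requirements.txt",
        "--exists-action", "i",
        # --no-binary :all: forces source distributions, so pip must be able
        # to resolve build dependencies such as setuptools from the index.
        "--no-binary", ":all:",
    ]
    try:
        subprocess.check_output(cmd, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as error:
        # Beam's processes.check_output wraps this in the RuntimeError shown
        # in the traceback above, attaching pip's captured output.
        print(error.output.decode())

Given that pip reports "No matching distribution found for setuptools ... (from versions: none)" while pyhamcrest and mock were served from the existing cache, this looks more like a transient index or network problem during staging than a defect in the test itself.
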
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_22-11895853363894008941?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_37_59-10785510369824958757?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_15-9846933298840271923?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_03-16129995451400638511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_19-11795770634317916797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_16-10148891729078533077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_24-3127463051672730494?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_17-11711935985632274275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_44-1021868720259357646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_20-18103919983053808150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_42-3805465173856689755?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_16-4277348219521291112?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_49-17455579697750164079?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_16-4568731592644846670?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_59-6077946322998847602?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1125.383s

FAILED (errors=1)

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests FAILED

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_36-1810617434513177406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_20-696372614863375816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_44-5573790182690839833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_54-3830497415976822735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_39-3177624233424445590?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_24-4049689903682462531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_38-14925718068900238150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_59-8479119326667681251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_37-1797318936311394053?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_47-10803508620156969513?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_38-4004323883666431436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_27-14592294258502025402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_37-4156354565281402949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_18-13657358402704771344?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_36-16712082472620552470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_41-18095945226845710915?project=apache-beam-testing
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1176.190s

OK

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_55-1898368996195712929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_02-10302185592139297734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_50-16112183069337268507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_02-5492999329959143563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_51-7200627869575622663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_37_54-2942123046387646367?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_50-12229374201177060355?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_38-1557170838958782591?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_50-17552748094327815221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_58-4996758202340701944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_51-5368040623527061552?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_34-8393223518583644666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_50-4725002026144853497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_39_33-4583957468126638257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_29_49-14017625792271811509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_05_38_07-5534634617713989079?project=apache-beam-testing
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1144.874s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 134

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 47m 54s
74 actionable tasks: 57 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/bzv5e4mjytk6w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4772

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/4772/display/redirect>




Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4771

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/4771/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8370] update outdated Flink gradle command

[kirillkozlov] [BEAM-8343] Added nessesary methods to BeamSqlTable to enable support


------------------------------------------
[...truncated 194.18 KB...]
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerBatchTests-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 2001.564s

OK

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.17.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:507: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for spec in yaml.load_all(open(transcript_filename)):
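
For reference, the options echoed above are ordinary Beam pipeline options passed through to the ValidatesRunner suite. A rough sketch of driving a pipeline with a trimmed-down subset of them follows (values copied from this log; a real run additionally needs GCP credentials and the test-only flags shown above):

    # Minimal sketch, not the actual test harness: build PipelineOptions from
    # a few of the flags logged above and run a trivial pipeline with them.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=TestDataflowRunner",
        "--project=apache-beam-testing",
        "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it",
        "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it",
        "--num_workers=1",
    ])

    with beam.Pipeline(options=options) as p:
        # Stand-in for the real test bodies (e.g. PTransformTest cases).
        _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
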

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1310.972s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_20-14103552457677623924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_30_26-9582609584180950300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_13-5433403205139177809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_29_16-8840632920370525613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_16-15694466083654813084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_31_05-7270326283065726085?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_16-12851236301832437559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_30_38-11792789303345831308?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_16-14562783482767345131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_30_10-1393315223690608164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_15-14258075880670957764?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_31_04-10492503434540162369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_15-17829290656344226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_32_13-8191689535060878050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_21_14-3437175357063272207?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_30_42-11146600538216594944?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_06-6974732414613634302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_58-8446770694639007668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_01-15562923741816177398?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_24-4996631718451686339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_02-4769335244909658019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_34_36-10907689923385070576?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_02-14015485647967156657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_33_51-6519617532579841845?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_01-11032340446984527622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_05-17973009665909796945?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_03-12368094052388955841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_51-11456061775784066761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_02-16779750189608664510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_33_46-9640109463304715007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_25_01-17264686073639408839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_25-495352920384355878?project=apache-beam-testing
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1238.799s

OK

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_33-8890546551068496014?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_47-7845918232017315233?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_29-7098073054478964972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_34_40-14670422823366029401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_27-16219820794119687613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_34_07-7857770760246776954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_29-11642293190915134882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_34_15-13414748092341959959?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_28-17862536016703115611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_38_44-7072375723576905606?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_29-1607581071371655041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_18-7484045515737225550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_29-14361643404668681319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_56-10471025838696843131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_24_27-16243429902120506633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_47-14452764009166977285?project=apache-beam-testing
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1494.458s

OK

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_12-281674577854283662?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_59-16574648447731113442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_05-10174886874872913752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_34_23-17724758016148301450?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_06-15773334296705024193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_36_59-18126646007209910136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_07-6920679908253484479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_30-6227024732962915220?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_06-2042762328787395244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_59-17721355546466347263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_05-8959624533289030349?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_34_19-7769944895837355684?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_06-15123046440733468049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_35_59-8313772346745633169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_26_06-10789068308560343169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_09_39_20-1180112916557662753?project=apache-beam-testing
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1409.926s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 43s
74 actionable tasks: 57 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/sozzmrsm774cm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4770

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/4770/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8369] update link to new container docs


------------------------------------------
[...truncated 296.98 KB...]
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s31"
        },
        "serialized_fn": "eNq9VOt31EQUT7JbWkIp0ioU8bGg6FZho/hCpAjdtjxWQg21O6g1TpLZTWhedzKh3XN2z1Fxe/pXeNRP/pnemd1SK8JHz5w87ut3Z+79zf2pUvdpTv2QuR6jSUNwmhadjCdFw884M5s0jqkXszanec74craamqAt/Az6AIw6qWia5nZSqPhBFMcNV75N1+eMCuZ2ytQXUYYB1fohe5zRwBW9nJkwQaYQopkFbB1lODKESQem6i29peFjtOaa5p7W157oXf2BBkftIZgLRMeQHTg2hGlS4K8VZgmzHrF0K0qL/e+lIqaPmbWd8a0Cz8cseTx3LStEM0uSSLhrPXfDcZepoJ0427YK7ltFsFVYeU+EWWr9oyjWQVEsWZRG3oPjat/XYpp4Ab0OM/fsiaYGJ4iBWqzHS0M4uSBg1oG5QyfvMuFSIbgJLysAr4xigbuFV8gkimiWVji1C6cdmD8UGiV5xoWbZEEZY+HOkLMY8ILWwatDOOvAayqPiyC+cF14fRfecOBNckQqGZQ0hpr9X83zGQpwLqzWw3E7JlrT2I6/hLaH7RjovYtCV3+G0GWDBpW+0a9sVfhlYQS6/O8Yp1D/i97W0rSqCUOgZsvkf+qaXDs3+tqytnllUO3N9PXfq/3qH7qutTWw0XcC/X4d+Uk0SYF9tIfoQfBpa32d//asNTWIFmhIlvM2OY2nXKVRzIIaLQrGxdXaBV5bXMQ3vLULb9dJFT3iqBBwQZWkwBKzAN4hcygsYVVvqrCVHZ/lksrwLjmKFsnVFc4zDnUVxlmSPWawQEwUNmhcjq3vCXh/5EF9IWt9kRxHge3kzMc8rsp8iZx4mtndN0FDeY6142hLkYTFLGGpgA8EfEjy/4X/rECSdq1SRLEk/+Ww1tpuntf0qSOGPqWWoc9OzuomfmdwGfq8XsU3fKTY9/RQHw/hE7wWnzrwWTgfniHz/6bwKFFDJoIrQ/jcgashUvYLB66FNTs8twmLQ7juwJdDuDGAm+SYpLGcJG4YpaKApcPDDA1K3wgYXgkqMl6Yd+7L7t2WahOaOMmW7QGs1BVUlOalUHgFrNpkGlVZKQ50t+xyF2572LQ7DtwdQsuBr4ZwbwB2PVwKJdh9BFurh6t2qHy/9kZbpLxbYA3kqHTCu6WABw6sK3yckjhhR2b4xlYdznnms6KAjXD9mRO2VRqCaR4epPnWK71N+G4A32/C5gvneTtKg2wba2zCD4jjDuDHuurRtjLg/ujz4kce5q0482g8wsEKeojiq0sheNTtMo4QwfMgxi7mMuvQMhbrYxEYgnTISXVP/DIpYyovm5x1DLotncxK+ChBdtAkd/0s8aKUcQjRpOoTFW4wgoRor/QEPGr8DQpmK34=",
        "user_name": "assert:even/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-10-10T14:23:07.018041Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-10-10_07_23_04-16336440167727060699'
 location: 'us-central1'
 name: 'beamapp-jenkins-1010142247-532163'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-10-10T14:23:07.018041Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-10-10_07_23_04-16336440167727060699]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_04-16336440167727060699?project=apache-beam-testing
root: INFO: Job 2019-10-10_07_23_04-16336440167727060699 is in state JOB_STATE_RUNNING
root: INFO: 2019-10-10T14:23:04.944Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-10-10_07_23_04-16336440167727060699. The number of workers will be between 1 and 1000.
root: INFO: 2019-10-10T14:23:04.944Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-10-10_07_23_04-16336440167727060699.
root: INFO: 2019-10-10T14:23:10.327Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T14:23:11.618Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-10-10T14:23:12.321Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-10-10T14:23:12.355Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert:even/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-10-10T14:23:12.392Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert:odd/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-10-10T14:23:12.427Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-10-10T14:23:12.469Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-10-10T14:23:12.509Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-10-10T14:23:12.627Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-10-10T14:23:12.686Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-10-10T14:23:12.725Z: JOB_MESSAGE_DETAILED: Unzipping flatten s8 for input s6.out
root: INFO: 2019-10-10T14:23:12.760Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-10-10T14:23:12.796Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-10-10T14:23:12.833Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-10-10T14:23:12.861Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-10-10T14:23:12.896Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-10-10T14:23:12.930Z: JOB_MESSAGE_DETAILED: Unzipping flatten s18 for input s16.out
root: INFO: 2019-10-10T14:23:12.956Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert:odd/Group/GroupByKey/Reify, through flatten assert:odd/Group/Flatten, into producer assert:odd/Group/pair_with_0
root: INFO: 2019-10-10T14:23:12.992Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Group/GroupByKey/GroupByWindow into assert:odd/Group/GroupByKey/Read
root: INFO: 2019-10-10T14:23:13.028Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Group/Map(_merge_tagged_vals_under_key) into assert:odd/Group/GroupByKey/GroupByWindow
root: INFO: 2019-10-10T14:23:13.055Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Unkey into assert:odd/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-10-10T14:23:13.095Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Match into assert:odd/Unkey
root: INFO: 2019-10-10T14:23:13.133Z: JOB_MESSAGE_DETAILED: Unzipping flatten s28 for input s26.out
root: INFO: 2019-10-10T14:23:13.172Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert:even/Group/GroupByKey/Reify, through flatten assert:even/Group/Flatten, into producer assert:even/Group/pair_with_0
root: INFO: 2019-10-10T14:23:13.208Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Group/GroupByKey/GroupByWindow into assert:even/Group/GroupByKey/Read
root: INFO: 2019-10-10T14:23:13.241Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Group/Map(_merge_tagged_vals_under_key) into assert:even/Group/GroupByKey/GroupByWindow
root: INFO: 2019-10-10T14:23:13.277Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Unkey into assert:even/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-10-10T14:23:13.313Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Match into assert:even/Unkey
root: INFO: 2019-10-10T14:23:13.345Z: JOB_MESSAGE_DETAILED: Unzipping flatten s8-u31 for input s9-reify-value18-c29
root: INFO: 2019-10-10T14:23:13.378Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-10-10T14:23:13.414Z: JOB_MESSAGE_DETAILED: Unzipping flatten s18-u36 for input s19-reify-value0-c34
root: INFO: 2019-10-10T14:23:13.450Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert:odd/Group/GroupByKey/Write, through flatten assert:odd/Group/Flatten/Unzipped-1, into producer assert:odd/Group/GroupByKey/Reify
root: INFO: 2019-10-10T14:23:13.486Z: JOB_MESSAGE_DETAILED: Unzipping flatten s28-u41 for input s29-reify-value9-c39
root: INFO: 2019-10-10T14:23:13.521Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert:even/Group/GroupByKey/Write, through flatten assert:even/Group/Flatten/Unzipped-1, into producer assert:even/Group/GroupByKey/Reify
root: INFO: 2019-10-10T14:23:13.557Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-10-10T14:23:13.585Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Group/GroupByKey/Reify into assert:odd/Group/pair_with_1
root: INFO: 2019-10-10T14:23:13.620Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Group/GroupByKey/Reify into assert:even/Group/pair_with_1
root: INFO: 2019-10-10T14:23:13.653Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-10-10T14:23:13.690Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Group/GroupByKey/Write into assert:odd/Group/GroupByKey/Reify
root: INFO: 2019-10-10T14:23:13.729Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Group/GroupByKey/Write into assert:even/Group/GroupByKey/Reify
root: INFO: 2019-10-10T14:23:13.766Z: JOB_MESSAGE_DETAILED: Fusing consumer ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:279>) into Some Numbers/Read
root: INFO: 2019-10-10T14:23:13.799Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:279>)
root: INFO: 2019-10-10T14:23:13.838Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-10-10T14:23:13.872Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-10-10T14:23:13.908Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/WindowInto(WindowIntoFn) into ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:279>)
root: INFO: 2019-10-10T14:23:13.949Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/ToVoidKey into assert:odd/WindowInto(WindowIntoFn)
root: INFO: 2019-10-10T14:23:13.986Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Group/pair_with_1 into assert:odd/ToVoidKey
root: INFO: 2019-10-10T14:23:14.019Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/WindowInto(WindowIntoFn) into ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:279>)
root: INFO: 2019-10-10T14:23:14.046Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/ToVoidKey into assert:even/WindowInto(WindowIntoFn)
root: INFO: 2019-10-10T14:23:14.069Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Group/pair_with_1 into assert:even/ToVoidKey
root: INFO: 2019-10-10T14:23:14.105Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-10-10T14:23:14.140Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:odd/Group/pair_with_0 into assert:odd/Create/Read
root: INFO: 2019-10-10T14:23:14.169Z: JOB_MESSAGE_DETAILED: Fusing consumer assert:even/Group/pair_with_0 into assert:even/Create/Read
root: INFO: 2019-10-10T14:23:14.206Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-10-10T14:23:14.239Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-10-10T14:23:14.277Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-10-10T14:23:14.311Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-10-10T14:23:14.475Z: JOB_MESSAGE_DEBUG: Executing wait step start57
root: INFO: 2019-10-10T14:23:14.548Z: JOB_MESSAGE_BASIC: Executing operation assert:odd/Group/GroupByKey/Create
root: INFO: 2019-10-10T14:23:14.578Z: JOB_MESSAGE_BASIC: Executing operation assert:even/Group/GroupByKey/Create
root: INFO: 2019-10-10T14:23:14.591Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-10-10T14:23:14.605Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-10-10T14:23:14.626Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-10-10T14:23:14.667Z: JOB_MESSAGE_BASIC: Finished operation assert:odd/Group/GroupByKey/Create
root: INFO: 2019-10-10T14:23:14.683Z: JOB_MESSAGE_BASIC: Finished operation assert:even/Group/GroupByKey/Create
root: INFO: 2019-10-10T14:23:14.683Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-10-10T14:23:14.729Z: JOB_MESSAGE_DEBUG: Value "assert:odd/Group/GroupByKey/Session" materialized.
root: INFO: 2019-10-10T14:23:14.763Z: JOB_MESSAGE_DEBUG: Value "assert:even/Group/GroupByKey/Session" materialized.
root: INFO: 2019-10-10T14:23:14.788Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-10-10T14:23:14.820Z: JOB_MESSAGE_BASIC: Executing operation assert:odd/Create/Read+assert:odd/Group/pair_with_0+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write
root: INFO: 2019-10-10T14:23:14.856Z: JOB_MESSAGE_BASIC: Executing operation assert:even/Create/Read+assert:even/Group/pair_with_0+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write
root: INFO: 2019-10-10T14:23:14.897Z: JOB_MESSAGE_BASIC: Executing operation Some Numbers/Read+ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:279>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/pair_with_1+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/pair_with_1+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write
root: INFO: 2019-10-10T14:23:14.936Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-10-10T14:23:40.524Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-10-10T14:29:14.445Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T14:35:14.455Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T14:41:14.454Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T14:47:14.454Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T14:53:14.454Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T14:59:14.455Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T15:05:14.457Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T15:11:14.456Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
root: INFO: 2019-10-10T15:17:14.456Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T15:23:14.456Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-10T15:23:14.822Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
root: INFO: 2019-10-10T15:23:14.895Z: JOB_MESSAGE_BASIC: Finished operation assert:odd/Create/Read+assert:odd/Group/pair_with_0+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write
root: INFO: 2019-10-10T15:23:14.895Z: JOB_MESSAGE_BASIC: Finished operation Some Numbers/Read+ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:279>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/pair_with_1+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/pair_with_1+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write
root: INFO: 2019-10-10T15:23:14.895Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-10-10T15:23:14.896Z: JOB_MESSAGE_BASIC: Finished operation assert:even/Create/Read+assert:even/Group/pair_with_0+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write
root: INFO: 2019-10-10T15:23:15.204Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2019-10-10_07_23_04-16336440167727060699.
root: INFO: 2019-10-10T15:23:15.289Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-10-10T15:23:15.356Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-10-10T15:23:15.392Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-10-10T15:25:40.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-10-10T15:25:40.953Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-10-10T15:25:40.990Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-10-10_07_23_04-16336440167727060699 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
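
For reference, the step names in the captured log above (Some Numbers/Read, ClassifyNumbers/FlatMap, assert:odd and assert:even) come from a multi-output ParDo pipeline. A minimal sketch of that pattern using the standard Beam Python API, not the actual test code:

    import apache_beam as beam
    from apache_beam.pvalue import TaggedOutput

    def classify(x):
        # Every element goes to the main output; a tagged copy goes to
        # 'even' or 'odd'.
        yield x
        yield TaggedOutput('even' if x % 2 == 0 else 'odd', x)

    with beam.Pipeline() as p:
        results = (
            p
            | 'Some Numbers' >> beam.Create([1, 2, 3, 4])
            | 'ClassifyNumbers' >> beam.FlatMap(classify).with_outputs(
                'odd', 'even', main='main'))
        results.odd | 'CheckOdd' >> beam.Map(print)
        results.even | 'CheckEven' >> beam.Map(print)
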
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_05-2371260237825550506?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_32_02-2707267825338710858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_07-16551845163086204364?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_31_42-6762821733323866554?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_42_15-5937479765915550398?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_04-4355558609430891964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_33_08-4527080202167431011?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_04-16336440167727060699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_04-12975729655065551673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_33_02-17198093337311242583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_04-16345322397476017599?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_32_37-17488946819708116652?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_42_10-14593772184878561437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_02-3183434350018679528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_32_35-1254682679733066885?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_42_17-14804505708108279229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_23_04-11228253588671285629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_07_33_07-7644650268165425249?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-validatesRunnerBatchTests-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 3787.488s

FAILED (errors=1)

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests FAILED
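
The "Combiner lifting skipped" JOB_MESSAGE_DEBUG lines in the captured log above are expected for these steps: the assert_that, assert:odd, and assert:even GroupByKeys are not followed by a combiner, so there is nothing to lift ahead of the shuffle. A minimal sketch of a shape a runner can lift, assuming the standard Beam Python API:

    import apache_beam as beam

    with beam.Pipeline() as p:
        totals = (
            p
            | beam.Create([('a', 1), ('a', 2), ('b', 3)])
            # CombinePerKey expands to GroupByKey + CombineValues, so a
            # runner can apply part of the combining before the shuffle.
            | beam.CombinePerKey(sum)
            | beam.Map(print))
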

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.17.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming
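
These flags are ordinary Beam pipeline options. A minimal sketch of how a subset of them is parsed with the standard PipelineOptions API (values copied from the command line above; constructing the options does not launch a job):

    from apache_beam.options.pipeline_options import (PipelineOptions,
                                                      StandardOptions)

    # A subset of the flags shown above, parsed as regular pipeline options.
    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
        '--streaming',
    ])
    assert options.view_as(StandardOptions).streaming
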
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:507: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for spec in yaml.load_all(open(transcript_filename)):
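
The YAMLLoadWarning above comes from calling yaml.load_all without an explicit Loader. A minimal sketch of the safe form, assuming the standard PyYAML API (the function name is illustrative, not from trigger_test.py):

    import yaml

    def load_transcripts(transcript_filename):
        # An explicit SafeLoader avoids the deprecated, unsafe default loader.
        with open(transcript_filename) as f:
            return list(yaml.load_all(f, Loader=yaml.SafeLoader))
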
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_12-13580528083942573350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_35_24-8506454027841245995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_04-12362509679928117809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_36_47-8406342940689003773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_05-8946641826558754260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_39_49-9599278490612661920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_05-12195681799663376982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_37_29-10779748406412199080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_05-11594148184376453197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_36_13-2058350838099920111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_05-13100779792870110641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_36_19-3365654112209598312?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_06-18359067382551128091?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_34_44-10839528419395970815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_26_06-3468206610750788530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-10_08_36_08-14922442752865160562?project=apache-beam-testing
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1519.297s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 111

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 29m 25s
74 actionable tasks: 57 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/izizvizvzpwqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org