Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/11 19:38:57 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #4530

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/4530/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-7947] Improves the interfaces of classes such as FnDataService,

[sunjincheng121] fix code style

[robinyqiu] Use a 0-length byte array to represent empty sketch in HllCount

------------------------------------------
[...truncated 174.42 KB...]
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 976.419s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_04-6013679027641815692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_07-7352991160066831108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_15_59-11993368493475186402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_26-18369847523063278128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_05-14130123362567063929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_22-2078625819515637585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_04-9172652746279710273?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_24-17014323103489094641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_04-2255612707668395389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_26-5119292766394893154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_04-13656513667026189292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_15-12725788149443364354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_05-7237458417337389702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_22-11723471934058116202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_16_02-12811130427249631304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_23_24-7127444477283620158?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_35-4099157205959451203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_18-12153413786905032435?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_28-9141306514847692262?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_27_22-7086424534436480425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_31-16349507254949094555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_13-14689866635790327997?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_33-4151260739057629896?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_27-6957383887828793338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_35-6395583214833224507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_08-129313486087255870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_34-6582073488847094288?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_22-7159759867121711420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_33-17366598481411506217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_29-16016608799863560225?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_19_34-12915926442847971645?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_28_11-6002805475824025336?project=apache-beam-testing
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1079.125s

OK

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_14-7408266676058613493?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_29_31-2561020869469456341?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_11-4100186995260360208?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_30_07-2402740910673866936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_16-7538636271406709586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_16-8200374325359613546?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_30_07-14253165722507501700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_20-7165388014517138894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_30_00-5583230348151421491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_19-11068222880970621504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_30_08-1656939436838088094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_16-17473052456178223927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_30_10-16933590843861733306?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_21_15-3161835499886667948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-11_12_30_17-6908846659528028374?project=apache-beam-testing
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ERROR
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

======================================================================
ERROR: test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/ptransform_test.py",> line 258, in test_par_do_with_multiple_outputs_and_using_return
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py",> line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 485, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 531, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 561, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 491, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py",> line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 708, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967054/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1
 
 Pip install failed for package: -r         
 Output from execution of subprocess: b'Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))\n  ERROR: Could not find a version that satisfies the requirement pyhamcrest (from -r postcommit_requirements.txt (line 1)) (from versions: none)\nERROR: No matching distribution found for pyhamcrest (from -r postcommit_requirements.txt (line 1))\n'
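The traceback above also shows why the message is shaped this way: apache_beam/utils/processes.py wraps subprocess.check_output and, on failure, re-raises a RuntimeError that carries the full pip output. A simplified sketch of that wrapping pattern, reconstructed from the traceback rather than copied from the Beam source:

    import subprocess
    import traceback

    def check_output(*args, **kwargs):
        try:
            # args[0] is the pip command list; index 6 is '-r', which is why
            # the message above reads "Pip install failed for package: -r".
            return subprocess.check_output(*args, **kwargs)
        except subprocess.CalledProcessError as error:
            raise RuntimeError(
                'Full traceback: {} \n Pip install failed for package: {} \n'
                ' Output from execution of subprocess: {}'.format(
                    traceback.format_exc(), args[0][6], error.output))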
-------------------- >> begin captured logging << --------------------
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911191958-265296.1568229598.265595/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911191958-265296.1568229598.265595/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911191958-265296.1568229598.265595/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0911191958-265296.1568229598.265595/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967054/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
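The captured logging shows the exact command the stager ran. The pip error "(from versions: none)" usually means the index returned no installable candidates for pyhamcrest, e.g. a transient PyPI or network problem, made stricter here because '--no-binary :all:' restricts pip to source distributions. A minimal local reproduction sketch; the interpreter and requirements file are stand-ins for the Jenkins workspace paths, and the pip arguments are copied from the log:

    import subprocess

    requirements_file = 'postcommit_requirements.txt'  # assumed to list pyhamcrest
    cache_dir = '/tmp/dataflow-requirements-cache'

    # Same invocation the Beam stager executed; raises CalledProcessError on failure.
    subprocess.check_call([
        'python', '-m', 'pip', 'download',
        '--dest', cache_dir,
        '-r', requirements_file,
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ])

Separately, the captured warning about --region can be silenced by setting the region explicitly in the pipeline options; an illustrative snippet (option values are examples, not taken from this job):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',  # set explicitly, as the warning recommends
    ])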

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1155.134s

FAILED (errors=1)

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 101

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 38s
72 actionable tasks: 56 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/gwitvzkkjrk7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #4531

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/4531/display/redirect?page=changes>

