Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/10/04 23:41:25 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #630

See <https://builds.apache.org/job/beam_PostCommit_Python2/630/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7389] Created elementwise for consistency with docs


------------------------------------------
[...truncated 1.08 MB...]
root: INFO: 2019-10-04T22:53:34.718Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
root: INFO: 2019-10-04T22:53:34.785Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-10-04T22:53:34.839Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-10-04T22:53:34.897Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
root: INFO: 2019-10-04T22:53:34.984Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-10-04T22:53:39.576Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-10-04T22:53:39.683Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
root: INFO: 2019-10-04T22:53:39.788Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-10-04T22:53:39.849Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-10-04T22:53:39.947Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
root: INFO: 2019-10-04T22:53:40.034Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-10-04T22:53:43.327Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-10-04T22:53:43.403Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-10-04T22:53:43.459Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-10-04T22:53:43.540Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-10-04T22:53:45.140Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15702292203294.beam_load_2019_10_04_225213_59_6e8b02fe891cf8212db3f5f085daf732_1f637604841049f885f82643da9c10da was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15702292203294.beam_load_2019_10_04_225213_59_6e8b02fe891cf8212db3f5f085daf732_1f637604841049f885f82643da9c10da was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)
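
Note the literal '%s' surviving into the exception text above: the raise at bigquery_file_loads.py line 511 apparently passes a logging-style format string and its argument straight to the Exception constructor, so the placeholder is never interpolated (Exception() just stores its args as a tuple; lazy %-formatting is a logging-module behavior, not an Exception one). A minimal Python sketch of that pattern and one possible fix; 'latest_error' below is an illustrative placeholder, not Beam's actual JobStatus object:

    # Placeholder standing in for the real JobStatus value shown in the log.
    latest_error = "<JobStatus ... reason: notFound ... state: DONE>"

    # Pattern visible in the traceback: two positional args to Exception, so
    # str(exc) renders the raw args tuple, '%s' included.
    exc = Exception('BigQuery jobs failed. BQ error: %s', latest_error)
    print(str(exc))  # ('BigQuery jobs failed. BQ error: %s', '<JobStatus ...>')

    # Interpolating before raising yields a readable message instead:
    exc = Exception('BigQuery jobs failed. BQ error: %s' % (latest_error,))
    print(str(exc))  # BigQuery jobs failed. BQ error: <JobStatus ...>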

root: INFO: 2019-10-04T22:53:52.052Z: JOB_MESSAGE_ERROR: [identical traceback and notFound JobStatus as at 22:53:45, repeated verbatim for the second failed attempt]
root: INFO: 2019-10-04T22:53:55.970Z: JOB_MESSAGE_ERROR: [identical traceback, repeated verbatim for the third failed attempt]
root: INFO: 2019-10-04T22:53:59.320Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-10-04T22:54:00.927Z: JOB_MESSAGE_ERROR: [identical traceback, repeated verbatim for the fourth and final failed attempt]
root: INFO: 2019-10-04T22:54:00.982Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-10-04T22:54:01.080Z: JOB_MESSAGE_DEBUG: Executing failure step failure98
root: INFO: 2019-10-04T22:54:01.115Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S33:WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-100422470-10041547-htkh-harness-90wr
      Root cause: Work item failed.,
  beamapp-jenkins-100422470-10041547-htkh-harness-90wr
      Root cause: Work item failed.,
  beamapp-jenkins-100422470-10041547-htkh-harness-90wr
      Root cause: Work item failed.,
  beamapp-jenkins-100422470-10041547-htkh-harness-90wr
      Root cause: Work item failed.
root: INFO: 2019-10-04T22:54:01.268Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-10-04T22:54:01.337Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-10-04T22:54:01.365Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-10-04T22:58:13.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-10-04T22:58:13.242Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-10-04T22:58:13.294Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-10-04_15_47_13-6428841780301670682 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
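
The BQ errorResult in the captured log is a plain 'notFound' on one of the job's temporary load tables, which could indicate the copy jobs raced against temp-table cleanup (the WriteWithMultipleDestsFreely RemoveTempTables/Delete step started at 22:53:43, just before the first error at 22:53:45). A hypothetical follow-up check, not part of the test run, assuming the google-cloud-bigquery client library and credentials for the apache-beam-testing project:

    # Hypothetical diagnostic: does the temp table named in the error still exist?
    from google.cloud import bigquery
    from google.cloud.exceptions import NotFound

    client = bigquery.Client(project='apache-beam-testing')
    table_id = ('apache-beam-testing.python_bq_file_loads_15702292203294.'
                'beam_load_2019_10_04_225213_59_6e8b02fe891cf8212db3f5f085daf732_'
                '1f637604841049f885f82643da9c10da')
    try:
        client.get_table(table_id)  # raises NotFound if the table is gone
        print('table still exists')
    except NotFound:
        print('table is gone, consistent with the notFound error above')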

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4245.730s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 58s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/m6pl2j35nojy4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_PostCommit_Python2 #631

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/631/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org