Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/27 17:52:11 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #326

See <https://builds.apache.org/job/beam_PostCommit_Python2/326/display/redirect?page=changes>

Changes:

[katarzyna.kucharczyk] [BEAM-7528] Save metrics according to distribution name.

[katarzyna.kucharczyk] [BEAM-7528] Refactor of load test utils and added documentation

[kamil.wasilewski] [BEAM-7872] Implement Flink class which takes responsibility of creating

[kamil.wasilewski] [BEAM-7872] Rewrite create_flink_cluster.sh so that it supports other

[altay] Removed unused arg from ToString.Element

[kamil.wasilewski] [BEAM-7872] Create a class for publishing docker images

------------------------------------------
[...truncated 1.08 MB...]
  File "apache_beam/runners/common.py", line 683, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

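Note the literal '%s' left in the exception text above: at bigquery_file_loads.py line 511 the format string and self._latest_error are passed to Exception as two separate arguments, so nothing is interpolated and str() of the exception renders the args tuple. A minimal sketch (hypothetical names, not the Beam source) that reproduces the rendering:

    # Stand-in for the BigQuery JobStatus object quoted in the log.
    class FakeJobStatus(object):
        def __repr__(self):
            return "<JobStatus errorResult: <ErrorProto reason: u'notFound'>>"

    latest_error = FakeJobStatus()

    try:
        # Format string and argument passed as separate args, as at line 511:
        raise Exception('BigQuery jobs failed. BQ error: %s', latest_error)
    except Exception as exn:
        print(exn)  # ('BigQuery jobs failed. BQ error: %s', <JobStatus ...>)

    try:
        # Interpolating first yields a readable message instead:
        raise Exception('BigQuery jobs failed. BQ error: %s' % latest_error)
    except Exception as exn:
        print(exn)  # BigQuery jobs failed. BQ error: <JobStatus ...>
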
root: INFO: 2019-08-27T16:58:18.376Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
root: INFO: 2019-08-27T16:58:18.445Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-08-27T16:58:18.507Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-08-27T16:58:18.586Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
root: INFO: 2019-08-27T16:58:18.666Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-08-27T16:58:19.462Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-08-27T16:58:22.710Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-08-27T16:58:22.788Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-08-27T16:58:22.880Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-08-27T16:58:22.952Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-08-27T16:58:26.316Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 799, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 805, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 872, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 803, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 610, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 683, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-08-27T16:58:34.551Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 799, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 805, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 872, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 803, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 610, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 683, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-08-27T16:58:37.309Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 799, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 805, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 872, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 803, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 610, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 683, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15669247195391.beam_load_2019_08_27_165656_53_51145c1467a968b3939abb483c2d1111_aa65b881b3744efa96964684b9d49339 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-08-27T16:58:37.341Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-08-27T16:58:37.395Z: JOB_MESSAGE_DEBUG: Executing failure step failure98
root: INFO: 2019-08-27T16:58:37.426Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S33:WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-082716520-08270952-8x7d-harness-4386,
  beamapp-jenkins-082716520-08270952-8x7d-harness-4386,
  beamapp-jenkins-082716520-08270952-8x7d-harness-4386,
  beamapp-jenkins-082716520-08270952-8x7d-harness-4386
root: INFO: 2019-08-27T16:58:37.524Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-08-27T16:58:37.631Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-08-27T16:58:37.701Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-08-27T16:58:37.742Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-08-27T17:03:25.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-08-27T17:03:25.429Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-08-27T17:03:25.475Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-08-27_09_52_10-15996498429464716535 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

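The captured errors all point at the same thing: a BigQuery copy job whose source is a temporary load table (beam_load_2019_08_27_...) that BigQuery reports as not found, so the copy job finishes in state DONE with an errorResult of reason 'notFound' and WaitForCopyJobs raises. Whether such a temp table is still present can be checked with the google-cloud-bigquery client; a minimal sketch (hypothetical table id, assumes credentials for the apache-beam-testing project and a client version that accepts string table ids):

    from google.api_core.exceptions import NotFound
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    # Placeholder: substitute the temp table named in the error message.
    table_id = 'apache-beam-testing.python_bq_file_loads_15669247195391.beam_load_XXXX'

    try:
        client.get_table(table_id)
        print('Temp table still exists.')
    except NotFound:
        print('Temp table is gone; copy jobs reading from it fail with notFound.')
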
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4537.904s

FAILED (SKIP=4, errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_49-4325098822697995618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_52_10-15996498429464716535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_03_49-3192032655313167940?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_13_33-3049184827737588253?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_45-5296781552316198289?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_57_18-14842716271951183605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_17_47-2801935094650477507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_48-13133680592153202941?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_50_49-5387344137734929438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_58_33-4592188799737374622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_09_01-981332270187856647?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_18_30-658588750194567831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_45-906518848973255064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_56_03-11755027453072541457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_04_49-4824408498336127569?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_13_21-13505106410279006661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_22_07-1666149548457563976?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_33_19-16229500398143494099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_42_11-15830451730223530485?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_45-8024950501908776096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_45_17-11278483484095048114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_57_01-2183018514998525976?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_07_50-8572354811386934753?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_17_47-15154719287731464096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_45-14602638493614340630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_45_14-5534652326631907302?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_56_38-18302520043884906513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_07_59-1185548940010384525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_18_35-8778320101531885249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_45-15929200218201208560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_46_28-17341133216966916209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_56_30-6023121601038533677?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_04_52-15126284132599213803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_14_10-8317483760050929296?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_23_26-10203213483461045046?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_36_45-17645982385402508423?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_46_41-7635200073916180787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_09_58_53-7237251974910700450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_08_56-1551760605207421862?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-08-27_10_18_40-3166076060547174069?project=apache-beam-testing.
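
Each URL above is a Dataflow job in the apache-beam-testing project; the failed run (2019-08-27_09_52_10-15996498429464716535, JOB_STATE_FAILED per the log) can also be inspected programmatically via the Dataflow v1b3 REST API. A minimal sketch with google-api-python-client (assumes application-default credentials with access to the project):

    from googleapiclient.discovery import build

    # Discovery-based client for the Dataflow v1b3 REST API.
    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2019-08-27_09_52_10-15996498429464716535',
    ).execute()
    print(job['currentState'])  # JOB_STATE_FAILED for the run captured above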

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 41s
110 actionable tasks: 85 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/gim7yp4f2bizg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #327

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/327/display/redirect?page=changes>

