Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/30 21:25:29 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #593

See <https://builds.apache.org/job/beam_PostCommit_Python2/593/display/redirect?page=changes>

Changes:

[lostluck] Helper to get the value of a KV type


------------------------------------------
[...truncated 691.73 KB...]
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)
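
Note on the unformatted '%s' in the message above: bigquery_file_loads.py line 511 passes the format string and self._latest_error to Exception as two separate arguments, so Python stores them as an args tuple instead of interpolating, which is why the repr shows a tuple. A minimal standalone sketch of that behavior ('latest_error' below is a hypothetical stand-in for self._latest_error):

    # Minimal sketch of the args-tuple behavior seen in the exception above.
    latest_error = "notFound: temp table missing"

    try:
        # Two arguments: Exception keeps them as a tuple and '%s' is never filled in.
        raise Exception('BigQuery jobs failed. BQ error: %s', latest_error)
    except Exception as exn:
        print(repr(exn.args))
        # ('BigQuery jobs failed. BQ error: %s', 'notFound: temp table missing')

    try:
        # Interpolating first produces the intended message.
        raise Exception('BigQuery jobs failed. BQ error: %s' % latest_error)
    except Exception as exn:
        print(exn)
        # BigQuery jobs failed. BQ error: notFound: temp table missing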

root: INFO: 2019-09-30T20:46:50.661Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-09-30T20:46:50.728Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
root: INFO: 2019-09-30T20:46:50.763Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
root: INFO: 2019-09-30T20:46:50.802Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-30T20:46:50.840Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-30T20:46:50.863Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-30T20:46:50.873Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-30T20:46:50.913Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-30T20:46:50.918Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-30T20:46:50.951Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
root: INFO: 2019-09-30T20:46:50.978Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
root: INFO: 2019-09-30T20:46:51.008Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten.out" materialized.
root: INFO: 2019-09-30T20:46:51.046Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-30T20:46:52.229Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)
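
The 'notFound' reason above can also be checked independently of the pipeline; a minimal sketch, assuming the google-cloud-bigquery client library and read access to the apache-beam-testing project (the table id is copied from the error message, so by the time this runs the temp table has likely been cleaned up):

    # Minimal sketch: verify whether the temp table from the error still exists.
    from google.api_core.exceptions import NotFound
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    table_id = ('apache-beam-testing.python_bq_file_loads_15698760558802.'
                'beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f'
                '_cc1a11225b5a46e480541c16c7ceda54')
    try:
        client.get_table(table_id)
        print('table exists')
    except NotFound:
        # Matches reason: u'notFound' above: the copy job referenced a temp
        # table that was already deleted or created in a different location.
        print('table not found in location US')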

root: INFO: 2019-09-30T20:46:52.942Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
    def start(self):
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.scoped_start_state:
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
    with self.spec.source.reader() as reader:
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
    self.output(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 256, in apache_beam.runners.worker.operations.Operation.output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 143, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 593, in apache_beam.runners.worker.operations.DoOperation.process
    with self.scoped_process_state:
  File "apache_beam/runners/worker/operations.py", line 594, in apache_beam.runners.worker.operations.DoOperation.process
    delayed_application = self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 776, in apache_beam.runners.common.DoFnRunner.receive
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 782, in apache_beam.runners.common.DoFnRunner.process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 849, in apache_beam.runners.common.DoFnRunner._reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 780, in apache_beam.runners.common.DoFnRunner.process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.PerWindowInvoker.invoke_process
    self._invoke_process_per_window(
  File "apache_beam/runners/common.py", line 660, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
    windowed_value, self.process_method(*args_for_process))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 511, in process
    'BigQuery jobs failed. BQ error: %s', self._latest_error)
Exception: (u"BigQuery jobs failed. BQ error: %s [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs']", <JobStatus
 errorResult: <ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>
 errors: [<ErrorProto
 message: u'Not found: Table apache-beam-testing:python_bq_file_loads_15698760558802.beam_load_2019_09_30_204544_65_6c04af14a3e8d5001e38f8ac52ffc19f_cc1a11225b5a46e480541c16c7ceda54 was not found in location US'
 reason: u'notFound'>]
 state: u'DONE'>)

root: INFO: 2019-09-30T20:46:52.969Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-09-30T20:46:53.040Z: JOB_MESSAGE_DEBUG: Executing failure step failure98
root: INFO: 2019-09-30T20:46:53.074Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S53:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.,
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.,
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.,
  beamapp-jenkins-093020405-09301341-ealp-harness-m54k
      Root cause: Work item failed.
root: INFO: 2019-09-30T20:46:53.599Z: JOB_MESSAGE_WARNING: S29:WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) failed.
root: INFO: 2019-09-30T20:46:53.636Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-30T20:46:53.749Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-30T20:46:54.379Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-30T20:46:54.407Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-30T20:49:20.976Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-30T20:49:21.014Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-30T20:49:21.048Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-30_13_41_11-18079351449406464958 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3560.654s

FAILED (SKIP=4, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_18-11262061949655352600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_34_26-4144143286882013219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_41_13-15773746256158000405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_48_36-3812148319424550034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_55_43-12644389554144417012?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_03_10-12400468502706147769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_10_26-479697534740292318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_17_54-15209654924793024956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_19-12127745356714157966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_44_57-4636402299800957687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_02_40-4243833746896103470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_24-2545215373072563376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_41_11-18079351449406464958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_49_45-3557387648186070139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_57_40-8269640652522417617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_05_15-16055213729348637551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_23-12183176833803058343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_39_04-8833938350517598946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_46_19-6215173368569923451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_52_57-2676182881689520548?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_00_00-2583388801769005969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_18-4162797980096061055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_43_50-2031423382545935332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_51_12-11165925138310668381?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_58_39-6114012822787355110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_17-6504666919768180457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_34_16-13171232894320765256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_42_15-3033505486995659633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_48_33-11406389230598089828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_55_41-2950184232358624408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_19-6438056568745940327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_34_08-395964721375571668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_43_38-2805282141519175509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_50_58-1726937768829822425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_58_26-279067626545712361?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_26_18-16154200092724124292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_35_50-5446163863862887261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_45_17-7939372622562979530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_13_52_16-3022913811158294446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_14_00_07-13934941983528028110?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 22s
109 actionable tasks: 85 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/vpsbqr4z5kt52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #594

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/594/display/redirect?page=changes>

