Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/03/19 20:51:28 UTC

Build failed in Jenkins: beam_PostCommit_Python35 #2034

See <https://builds.apache.org/job/beam_PostCommit_Python35/2034/display/redirect?page=changes>

Changes:

[crites] Clean up of TestStreamTranscriptTests. Removes check for final field in

[crites] Adds clearing of pane info state when windows get merged away.

[iemejia] Move CHANGES template related items into template section


------------------------------------------
[...truncated 10.47 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.474Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.503Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.545Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.587Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber into WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.621Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.652Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:870>) into WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.689Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix into WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.720Z: JOB_MESSAGE_DETAILED: Fusing siblings WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs and WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.755Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.782Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.822Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.856Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.887Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.926Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.954Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:43.989Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.027Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.059Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.097Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.134Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.386Z: JOB_MESSAGE_DEBUG: Executing wait step start46
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.457Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteToBigQuery/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:870>)+WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.498Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.500Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.535Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.536Z: JOB_MESSAGE_BASIC: Executing operation CreateSchema/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.576Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.579Z: JOB_MESSAGE_BASIC: Finished operation CreateSchema/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.605Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.617Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.657Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.658Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.678Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.692Z: JOB_MESSAGE_DEBUG: Value "CreateSchema/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.713Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.720Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.753Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.787Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.830Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.863Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.866Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.897Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.901Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.930Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.934Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.962Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:44.966Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:45.006Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:45.045Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:45.074Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:45.110Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:05:57.436Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
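
As an aside on the metric-descriptor warning above: stale Dataflow custom-metric descriptors can also be cleaned up programmatically rather than one by one through the API explorer links. A minimal sketch, assuming the google-cloud-monitoring v2+ client; the metric-type prefix is an assumption and should be verified against your project's descriptors before deleting anything (the list call alone is read-only):

    # Hypothetical cleanup helper; confirm the prefix matches your
    # project's Dataflow-created descriptors before deleting.
    from google.cloud import monitoring_v3

    def delete_old_dataflow_descriptors(project_id,
                                        prefix="custom.googleapis.com/dataflow"):
        client = monitoring_v3.MetricServiceClient()
        request = {
            "name": "projects/{}".format(project_id),
            # starts_with() is Cloud Monitoring filter syntax.
            "filter": 'metric.type = starts_with("{}")'.format(prefix),
        }
        for descriptor in client.list_metric_descriptors(request=request):
            print("Deleting {}".format(descriptor.name))
            client.delete_metric_descriptor(name=descriptor.name)
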
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:06:10.109Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:07:59.585Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:07:59.617Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:44.361Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.655Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteToBigQuery/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:870>)+WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.744Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.785Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:870>).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.824Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.874Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.907Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.939Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.961Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.971Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.975Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:49.993Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.009Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.036Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.045Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.065Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.069Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.094Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:870>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.127Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.162Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:11:50.236Z: JOB_MESSAGE_BASIC: Executing operation CreateInput/Read+WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:07.767Z: JOB_MESSAGE_BASIC: Finished operation CreateInput/Read+WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:07.843Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:07.904Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:07.963Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber+WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:11.058Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber+WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:11.132Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:11.192Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:11.292Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:31.678Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 178, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/nativeavroio.py", line 309, in __exit__
    self._data_file_writer.fo.close()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/filesystemio.py", line 219, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/gcsio.py", line 615, in finish
    raise self._upload_thread.last_error  # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/gcsio.py", line 590, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AEnB2UpThHnHPSROJFCd7fLDaGiEQlhwJXXEPOsWxDbICYkapFLAJo2c3k9PtmRUpi5-FVs9Y13h6Fmamx2TsFtHFuLUxeSfQaLGKY8G5iIMP8r6FFcE_T4&name=temp-it%2Fbeamapp-jenkins-0319200518-663025.1584648318.663210%2Fdax-tmp-2020-03-19_13_05_36-3442848243926024158-S17-2-47146c84e443c25%2Ftmp-47146c84e4430ec-shard--try-73e4cb8ce357d88c-endshard.avro>: response: <{'status': '503', 'content-length': '0', 'date': 'Thu, 19 Mar 2020 20:12:29 GMT', 'x-guploader-uploadid': 'AEnB2UpThHnHPSROJFCd7fLDaGiEQlhwJXXEPOsWxDbICYkapFLAJo2c3k9PtmRUpi5-FVs9Y13h6Fmamx2TsFtHFuLUxeSfQaLGKY8G5iIMP8r6FFcE_T4', 'server': 'UploadServer', 'content-type': 'text/plain; charset=utf-8'}>, content <>
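
The traceback above bottoms out in a transient HTTP 503 from the GCS resumable-upload endpoint, re-raised when the upload thread's last error surfaces in finish(). 5xx responses like this are generally retryable; a generic backoff sketch to illustrate the usual remedy (a hypothetical helper, not the Beam SDK's own retry machinery):

    import random
    import time

    def _status(exc):
        # Works for apitools HttpError (response dict with a 'status' key)
        # and for clients exposing a numeric status_code; None otherwise.
        response = getattr(exc, "response", None)
        if isinstance(response, dict) and "status" in response:
            return int(response["status"])
        code = getattr(exc, "status_code", None)
        return int(code) if code is not None else None

    def call_with_backoff(fn, max_attempts=5, base_delay=1.0):
        """Retry fn on HTTP 5xx with exponential backoff plus jitter."""
        for attempt in range(1, max_attempts + 1):
            try:
                return fn()
            except Exception as exc:
                status = _status(exc)
                if attempt == max_attempts or status is None or not 500 <= status < 600:
                    raise
                time.sleep(base_delay * (2 ** (attempt - 1)) + random.uniform(0, 1))

Dataflow retried the failed work item itself, which is why the same fused operation still finishes in the messages below and the job ends in JOB_STATE_DONE.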

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.014Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.090Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.129Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.171Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.207Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.243Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.277Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.294Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.310Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.312Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.332Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.371Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.384Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.425Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.460Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.498Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/Flatten.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:47.531Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:52.457Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:52.539Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:52.619Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:52.679Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:52.741Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:52.800Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:55.631Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:55.711Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:55.780Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:55.848Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:58.759Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:58.839Z: JOB_MESSAGE_DEBUG: Executing success step success44
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:58.962Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:59.166Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:12:59.202Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:14:22.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:14:22.058Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-19T20:14:22.103Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-19_13_05_36-3442848243926024158 is in state JOB_STATE_DONE
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT name, value, timestamp FROM python_bq_file_loads_15846483167365.output_table WHERE value<0 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/648e01e1-c379-41b5-a9ab-715be72d78a8?maxResults=0&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon8150ea1a1e3ad60ab9193135d50585b2dd506c1a/data HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Result of query is: [('Negative infinity', -inf, datetime.datetime(1970, 1, 1, 0, 0, tzinfo=<UTC>)), ('Negative infinity', -inf, datetime.datetime(1970, 1, 1, 0, 0, tzinfo=<UTC>))]
apache_beam.io.gcp.bigquery_test: INFO: Deleting dataset python_bq_file_loads_15846483167365 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
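
For reference, the matcher's verification query above can be reproduced from a workstation with the google-cloud-bigquery client. A minimal sketch using the dataset and table names from the log (the dataset is deleted at the end of the test run, so this only works while it still exists):

    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    query = ("SELECT name, value, timestamp "
             "FROM python_bq_file_loads_15846483167365.output_table "
             "WHERE value < 0")
    for row in client.query(query).result():
        print(row.name, row.value, row.timestamp)
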
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_33-4303855501113841491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_05_36-3442848243926024158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_15_07-14979949747714940423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_24_05-12656130266357952046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_33_17-6030083586296331420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_39-9355879950770049347?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_57_31-6678721762310469269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_07_31-9414166745105416349?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_16_46-3755000684367214806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_25_56-13995874568332412744?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_34_39-10260215430571477963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_38-12858870207049391539?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_56_16-13129672208159072685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_04_44-16202950240270913419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_13_54-10792109589449214530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_21_57-6222596552596732350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_30_28-2935160164342765504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_33-3498951537567074872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_02_26-10101939379036478058?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_12_42-15111192023807715610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_20_53-15003412773280310930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_30_07-43299847561856882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_35-10201449951749028226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_52_46-10191282397800256117?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_00_55-5701191429395296111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_09_27-1472603445701118?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_18_22-11135685177152082335?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_26_42-13771272414156689567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_35_41-11708811362246950742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_33-14611508013096969702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_51_06-12038964113354761181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_02_23-12168844007456961213?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_10_23-17067855207351686692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_19_11-1305679569632915930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_28_02-2136545972304302779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_37_01-15168245512141248484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_34-2225221772017675470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_53_41-2333259180851632868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_06_33-8366976712006503578?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_16_19-2734306682754211270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_33_29-7950777463892944350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_43_36-8835511805035835434?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_12_51_48-227565425955753927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_00_50-702704346218368199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_09_13-7915490288706124353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_18_19-2307328072267331367?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_26_11-17544867167090639467?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_33_30-16674254080623549885?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-19_13_42_51-10425784568815414615?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 4097.931s

FAILED (SKIP=8, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 3s
86 actionable tasks: 65 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/2t7vqxgdedi4g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python35 #2035

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/2035/display/redirect?page=changes>

