Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/03/03 17:58:57 UTC

Build failed in Jenkins: beam_PostCommit_Python35 #1895

See <https://builds.apache.org/job/beam_PostCommit_Python35/1895/display/redirect?page=changes>

Changes:

[kamil.wasilewski] Add integration test for AnnotateImage transform

[kamil.wasilewski] Fix: skip test if GCP dependencies are not installed


------------------------------------------
[...truncated 10.03 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:50.918Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:50.955Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:50.968Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:50.988Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.010Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.029Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.057Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.072Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.089Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.089Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.108Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.117Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.154Z: JOB_MESSAGE_BASIC: Executing operation group/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.164Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.178Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.190Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.220Z: JOB_MESSAGE_BASIC: Finished operation group/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.227Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.252Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.264Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.298Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.303Z: JOB_MESSAGE_BASIC: Finished operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.331Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.366Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.411Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.456Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.489Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.521Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.554Z: JOB_MESSAGE_DEBUG: Value "read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.601Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2643>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:09:51.639Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:10:23.296Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:10:28.761Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:11:03.609Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:11:03.660Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.684Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2643>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.791Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.824Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.856Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.880Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.895Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.919Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.951Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.956Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:40.994Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:41.049Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:42.777Z: JOB_MESSAGE_BASIC: Finished operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:42.857Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:42.922Z: JOB_MESSAGE_BASIC: Finished operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:14:43.011Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:02.993Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:03.069Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:03.144Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:03.218Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:03.296Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey0.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.280Z: JOB_MESSAGE_BASIC: Finished operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.367Z: JOB_MESSAGE_BASIC: Executing operation group/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.423Z: JOB_MESSAGE_BASIC: Finished operation group/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.642Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.712Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.789Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.842Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.915Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(MapToVoidKey0.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:04.981Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:11.032Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:11.095Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:11.158Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:11.212Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:11.276Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey0.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:12.524Z: JOB_MESSAGE_BASIC: Finished operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:12.602Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:12.656Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:12.739Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract+write/Write/WriteImpl/PreFinalize/MapToVoidKey1+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1+write/Write/WriteImpl/PreFinalize/MapToVoidKey1+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.397Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract+write/Write/WriteImpl/PreFinalize/MapToVoidKey1+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1+write/Write/WriteImpl/PreFinalize/MapToVoidKey1+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.472Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.507Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.547Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.580Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.615Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:17.652Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:22.312Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:22.398Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:22.503Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:22.607Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:22.802Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:25.855Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:25.922Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:26.009Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:26.061Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:26.133Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:26.215Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/PreFinalize+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:29.483Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/PreFinalize+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:29.564Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:29.645Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:29.718Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:31.582Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:31.666Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:31.733Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:31.800Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:31.874Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:31.940Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:33.481Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:33.565Z: JOB_MESSAGE_DEBUG: Executing success step success118
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:33.742Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:33.890Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:15:33.940Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:16:27.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:16:27.166Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:16:27.284Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_09_09_40-14755966577101583269 is in state JOB_STATE_DONE
apache_beam.testing.pipeline_verifiers: INFO: Wait 20 seconds...
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1583255352074\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
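The `translate_pattern` line above shows the filesystem matcher turning the glob `results*-of-*` into a regex in which `*` matches any run of characters except path separators, so a pattern never crosses into nested paths. A minimal sketch of that translation, assuming nothing beyond the stdlib (this is an illustrative re-implementation, not Beam's actual `apache_beam.io.filesystem` code):

```python
import re

def translate_pattern(pattern):
    # Illustrative glob-to-regex translation in the spirit of the DEBUG
    # output above: '*' and '?' deliberately exclude '/' and '\\' so a
    # wildcard stays within one path segment; everything else is escaped
    # literally.
    out = []
    for ch in pattern:
        if ch == '*':
            out.append(r'[^/\\]*')
        elif ch == '?':
            out.append(r'[^/\\]')
        else:
            out.append(re.escape(ch))
    return ''.join(out)

pat = translate_pattern('results*-of-*')
# Matches a sharded output name in one segment...
assert re.fullmatch(pat, 'results-00000-of-00001')
# ...but not a name that would cross a '/' boundary.
assert re.fullmatch(pat, 'results/nested-of-file') is None
```

The escaped regex in the log (`results[^/\\]*\-of\-[^/\\]*`) is exactly this shape, which is why zero files listed means no shard of the expected output exists yet.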
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.04580378532409668 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 4.594590619069006 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

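The repeated `Retry with exponential backoff: waiting for N seconds` warnings come from a retry decorator in `apache_beam.utils.retry` that re-invokes `_read_with_retry` after an increasing, jittered delay. A minimal sketch of the pattern, under the assumption of a doubling base delay with multiplicative jitter (illustrative only, not Beam's implementation; the decorator name and parameters here are hypothetical):

```python
import random
import time

def retry_with_exponential_backoff(retries=4, initial_delay_secs=5.0):
    """Decorator sketch: retry the wrapped function on OSError, sleeping
    initial_delay_secs * 2**attempt (scaled by random jitter) between
    attempts, and re-raising once the attempts are exhausted."""
    def decorator(fun):
        def wrapper(*args, **kwargs):
            for attempt in range(retries):
                try:
                    return fun(*args, **kwargs)
                except OSError:
                    if attempt == retries - 1:
                        raise  # out of retries: surface the error
                    delay = initial_delay_secs * (2 ** attempt)
                    delay *= random.uniform(0.5, 1.5)  # jitter
                    time.sleep(delay)
        return wrapper
    return decorator
```

The waits logged below (roughly 4.6s, 5.3s, 16.3s, 30.5s) follow this doubling-with-jitter shape: the verifier keeps polling because the job finished but its expected output files never appeared.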
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1583255352074\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.04792332649230957 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 5.2902561597669635 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1583255352074\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.042261600494384766 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 16.30543253364499 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1583255352074\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.045571088790893555 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 30.537410591856975 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*-of-*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1583255352074\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.04588127136230469 seconds.
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1583255352074/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1583255352074\\/results[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.03856086730957031 seconds.
--------------------- >> end captured logging << ---------------------
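The WARNING lines in the captured log above come from Beam's retry-with-exponential-backoff helper: each failed `_read_with_retry` attempt waits roughly twice as long as the previous one (with random jitter, hence the uneven 4.6s, 5.3s, 16.3s, 30.5s waits) before polling GCS for the output files again. A minimal sketch of that pattern, with illustrative names that are not Beam's actual API:

```python
import random
import time


def retry_with_backoff(fun, num_retries=4, initial_delay_secs=5.0,
                       sleep=time.sleep):
    """Call fun(), retrying on IOError with exponentially growing, jittered waits.

    The sleep callable is injectable so tests can skip real waiting.
    """
    delay = initial_delay_secs
    for attempt in range(num_retries + 1):
        try:
            return fun()
        except IOError:
            if attempt == num_retries:
                raise  # retries exhausted; surface the last error
            # Jitter the wait so concurrent retries do not synchronize.
            sleep(delay * random.uniform(0.5, 1.5))
            delay *= 2
```

In the failing test above the output files never appeared, so every attempt raised `IOError('No such file or directory: ...')` and the verifier eventually gave up.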
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_15-5386104353094451279?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_16_26-5414426480203431673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_25_05-8930310948730100260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_32_59-3885367571074685167?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_40_09-5619531579385068065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_10-14931827158329579244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_21_46-214188176001266972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_29_11-11794130588468680981?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_36_01-6481970002918026491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_42_49-10467693171822069430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_13-9820750283506503247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_14_07-14821830031734282898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_21_10-1567770250553482073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_28_27-16112040874332378021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_35_03-10701049986169276466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_42_00-4411441258337457581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_10-8386681171747343590?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_18_41-6298052407393543818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_25_44-16213904332700779378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_32_25-10615885654580580324?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_39_34-15346323442232962765?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_46_43-4343705412558011859?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_10-2211284840965869433?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_09_38-4698675814137685112?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_16_57-13966596080498589718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_24_37-11203964592574145436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_32_24-8407386889294595130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_39_35-3681960971029241332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_09-16034512536024947391?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_09_40-14755966577101583269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_18_22-3825463984437838949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_25_39-18411685176325915617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_32_28-8934387008069657480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_40_35-8149799168076035462?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_12-12925903616923164780?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_10_48-9134956951043140155?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_21_27-8407144557915622306?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_28_35-11456092320594423203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_45_26-6405821123828971499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_52_08-15663707691696010376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_02_13-6905674699668067797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_10_50-13068049884368604139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_17_21-6899442612239947039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_25_11-9687235316504936036?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_32_43-12328431626107160860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_39_20-13459928498191713751?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_46_40-13979889967720747222?project=apache-beam-testing
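The `translate_pattern` DEBUG entries in the captured log show the `results*-of-*` glob being compiled into a regular expression before matching against the GCS listing. Python's standard library performs a comparable glob-to-regex translation; a rough illustration (not Beam's own implementation — note that Beam's version keeps `*` from crossing `/` boundaries, as the `[^/\\]*` in the log shows, whereas `fnmatch.translate` lets `*` match anything):

```python
import fnmatch
import re

# Hypothetical bucket/path, standing in for the test's temp-storage location.
pattern = 'gs://bucket/output/results*-of-*'
regex = fnmatch.translate(pattern)  # glob -> regex source string

# Matches shard names such as results-00000-of-00001...
assert re.match(regex, 'gs://bucket/output/results-00000-of-00001')
# ...but not unrelated objects under the same prefix.
assert not re.match(regex, 'gs://bucket/output/other-file')
```

Here the pattern matched zero objects ("Finished listing 0 files"), which is what triggered the retries and, ultimately, the test error.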

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 56 tests in 3432.345s

FAILED (SKIP=8, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 39s
84 actionable tasks: 63 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/sozytmry475ec

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python35 #1896

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/1896/display/redirect?page=changes>

