Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/11/11 02:56:05 UTC

beam_PostCommit_Python_Examples_Dataflow - Build # 1107 - Aborted!

beam_PostCommit_Python_Examples_Dataflow - Build # 1107 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1107/ to view the results.

Jenkins build is back to normal : beam_PostCommit_Python_Examples_Dataflow #1112

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1112/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_PostCommit_Python_Examples_Dataflow - Build # 1111 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python_Examples_Dataflow - Build # 1111 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1111/ to view the results.

beam_PostCommit_Python_Examples_Dataflow - Build # 1110 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python_Examples_Dataflow - Build # 1110 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1110/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python_Examples_Dataflow #1109

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1109/display/redirect>

Changes:


------------------------------------------
[...truncated 152.90 KB...]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.553Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal into WriteTeamScoreSums/ConvertToRow
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.587Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.620Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.642Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.683Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.711Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.746Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.768Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.800Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3498>) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.828Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3498>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.887Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.928Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.949Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:17.979Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.175Z: JOB_MESSAGE_DEBUG: Executing wait step start61
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.240Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3498>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.264Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read/Impulse+ReadInputText/Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.276Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.297Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3498>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.309Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.320Z: JOB_MESSAGE_BASIC: Executing operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.355Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.368Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.385Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.405Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.410Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.429Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.444Z: JOB_MESSAGE_DEBUG: Value "HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.464Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.478Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.511Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:18.546Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:57:52.427Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
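(The message above points at the Cloud Monitoring API for pruning stale custom metric descriptors. A minimal cleanup sketch using the google-cloud-monitoring client is shown below; the project id is a placeholder, not taken from this job, and the delete call is left commented out because it is irreversible.)

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/your-project-id"  # placeholder project
request = {
    "name": project_name,
    # Only look at Dataflow-created custom metric descriptors.
    "filter": 'metric.type = starts_with("custom.googleapis.com/")',
}
for descriptor in client.list_metric_descriptors(request=request):
    print(descriptor.name)  # inspect before deleting anything
    # client.delete_metric_descriptor(name=descriptor.name)  # irreversible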
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:58:00.781Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T13:58:19.557Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:56.319Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3498>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:57.853Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3498>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:57.910Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix.None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:57.932Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix.None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:57.964Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix.None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:57.995Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix.None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.026Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.049Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.078Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.082Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.096Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.102Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.135Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.140Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.154Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.167Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.186Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.202Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs).out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.222Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.225Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile.out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.256Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.289Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.321Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:04:58.354Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:00.532Z: JOB_MESSAGE_BASIC: Finished operation ReadInputText/Read/Impulse+ReadInputText/Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:00.583Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7-split-with-sizing-out3" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:00.657Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+HourlyTeamScore/ParseGameEventFn+HourlyTeamScore/FilterStartTime+HourlyTeamScore/FilterEndTime+HourlyTeamScore/AddEventTimestamps+HourlyTeamScore/FixedWindowsTeam+HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:142>)+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:12.193Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+HourlyTeamScore/ParseGameEventFn+HourlyTeamScore/FilterStartTime+HourlyTeamScore/FilterEndTime+HourlyTeamScore/AddEventTimestamps+HourlyTeamScore/FixedWindowsTeam+HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:142>)+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:12.250Z: JOB_MESSAGE_BASIC: Executing operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:12.290Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:12.350Z: JOB_MESSAGE_BASIC: Executing operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/GroupByWindow+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:14.028Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/GroupByWindow+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:14.086Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:14.125Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:14.185Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:15.324Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:15.375Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:15.413Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:15.471Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.026Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.089Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.122Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).OngoingJobs" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.156Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.184Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.217Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs).OngoingJobs" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.250Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input1-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.288Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Flatten
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.292Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input1-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.346Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input1-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs).out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.371Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Flatten
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.399Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:30.420Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Flatten.None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:33.451Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:33.501Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs).TriggerDeleteTempTables" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:33.554Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:33.600Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:33.656Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue/View-python_side_input0-WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue.out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:33.770Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:35.072Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:35.131Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:35.183Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:35.225Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:36.319Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:36.362Z: JOB_MESSAGE_DEBUG: Executing success step success59
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:36.432Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:36.470Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:05:36.493Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:07:51.549Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:07:51.591Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:07:51.621Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-11_05_57_11-1726328336831336532 is in state JOB_STATE_DONE
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to perform query SELECT COUNT(*) FROM `apache-beam-testing.hourly_team_score_it_dataset16681750255630.leader_board` to BQ
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:96 Read from given query (SELECT COUNT(*) FROM `apache-beam-testing.hourly_team_score_it_dataset16681750255630.leader_board`), total rows 1
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:101 Generate checksum: 91143e81622aa391eb62eaa3f3a5123401edb07d
PASSED
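(The checksum reported above is just a hash over the verification query's result rows. A conceptual sketch with the google-cloud-bigquery client follows; the function name and canonicalization are illustrative rather than the matcher's exact implementation.)

import hashlib
from google.cloud import bigquery

def query_checksum(query, project="apache-beam-testing"):
    # Run the verification query and hash a canonical string form of the rows,
    # mirroring the idea behind bigquery_matcher's checksum comparison.
    rows = bigquery.Client(project=project).query(query).result()
    canonical = "\n".join(sorted(str(tuple(row.values())) for row in rows))
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()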
apache_beam/examples/complete/game/user_score_it_test.py::UserScoreIT::test_userscore_output_checksum_on_small_input 
-------------------------------- live log call ---------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:778 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/2050596098/bin/python3.10', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpvg9vpplj/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp310', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:484 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.10_sdk:2.44.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20221021
INFO     root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20221021" for Docker environment
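(The pre-building workflow referenced a few lines above is driven by pipeline options. A rough sketch of enabling it for a job with extra dependencies; every project, bucket and registry value below is a placeholder.)

from apache_beam.options.pipeline_options import PipelineOptions

# Ask the pre-building workflow (Cloud Build) to bake the extra dependencies
# into the SDK container once, instead of reinstalling them on every worker.
options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=your-project-id',
    '--region=us-central1',
    '--temp_location=gs://your-bucket/tmp',
    '--requirements_file=requirements.txt',
    '--prebuild_sdk_container_engine=cloud_build',
    '--docker_registry_push_url=us-central1-docker.pkg.dev/your-project-id/your-repo',
])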
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f23bcfc1c60> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f23bcfc2440> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/pickled_main_session...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/pickled_main_session in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/seaborn-0.12.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/seaborn-0.12.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/matplotlib-3.6.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/matplotlib-3.6.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1111140810-271448-j1pmc5z6.1668175690.271601/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
 clientRequestId: '20221111140810272473-8399'
 createTime: '2022-11-11T14:08:14.222702Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-11-11_06_08_13-5817533785343965422'
 location: 'us-central1'
 name: 'beamapp-jenkins-1111140810-271448-j1pmc5z6'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-11-11T14:08:14.222702Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-11-11_06_08_13-5817533785343965422]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-11-11_06_08_13-5817533785343965422
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:915 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-11_06_08_13-5817533785343965422?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-11_06_08_13-5817533785343965422?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: 
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-11_06_08_13-5817533785343965422?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-11_06_08_13-5817533785343965422 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:14.691Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-11-11_06_08_13-5817533785343965422. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:14.724Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-11-11_06_08_13-5817533785343965422.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:16.947Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.178Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.210Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.288Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.326Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteUserScoreSums/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.404Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.436Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.481Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.527Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/InitializeWrite into WriteUserScoreSums/Write/WriteImpl/DoOnce/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.569Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3498>) into WriteUserScoreSums/Write/WriteImpl/DoOnce/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.608Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/DoOnce/Map(decode) into WriteUserScoreSums/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3498>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.642Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadInputText/Read/Map(<lambda at iobase.py:908>) into ReadInputText/Read/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.676Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction into ReadInputText/Read/Map(<lambda at iobase.py:908>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.712Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing into ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.747Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ParseGameEventFn into ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.780Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:136>) into UserScore/ParseGameEventFn
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.802Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial into UserScore/ExtractAndSumScore/Map(<lambda at user_score.py:136>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.830Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write into UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.867Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine into UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.891Z: JOB_MESSAGE_DETAILED: Fusing consumer UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract into UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.930Z: JOB_MESSAGE_DETAILED: Fusing consumer FormatUserScoreSums into UserScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.962Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/WindowInto(WindowIntoFn) into FormatUserScoreSums
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:19.998Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/WriteBundles into WriteUserScoreSums/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.032Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/Pair into WriteUserScoreSums/Write/WriteImpl/WriteBundles
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.071Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/GroupByKey/Write into WriteUserScoreSums/Write/WriteImpl/Pair
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.106Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/Extract into WriteUserScoreSums/Write/WriteImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.168Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.198Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.227Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.263Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.444Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.518Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read/Impulse+ReadInputText/Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.555Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/DoOnce/Impulse+WriteUserScoreSums/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3498>)+WriteUserScoreSums/Write/WriteImpl/DoOnce/Map(decode)+WriteUserScoreSums/Write/WriteImpl/InitializeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.567Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.587Z: JOB_MESSAGE_BASIC: Executing operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.598Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.621Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.691Z: JOB_MESSAGE_BASIC: Finished operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.691Z: JOB_MESSAGE_BASIC: Finished operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.765Z: JOB_MESSAGE_DEBUG: Value "WriteUserScoreSums/Write/WriteImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.799Z: JOB_MESSAGE_DEBUG: Value "UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:29.758Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=1601b98d-58f5-4894-8351-e2af5766d951, currentDir=https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src}
Attempting to read last messages from the daemon log...
Daemon pid: 2703019
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-2703019.out.log
----- Last  20 lines from daemon log file - daemon-2703019.out.log -----
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.071Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/GroupByKey/Write into WriteUserScoreSums/Write/WriteImpl/Pair
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.106Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/Write/WriteImpl/Extract into WriteUserScoreSums/Write/WriteImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.168Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.198Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.227Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.263Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.444Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.518Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read/Impulse+ReadInputText/Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.555Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/DoOnce/Impulse+WriteUserScoreSums/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3498>)+WriteUserScoreSums/Write/WriteImpl/DoOnce/Map(decode)+WriteUserScoreSums/Write/WriteImpl/InitializeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.567Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.587Z: JOB_MESSAGE_BASIC: Executing operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.598Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.621Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.691Z: JOB_MESSAGE_BASIC: Finished operation WriteUserScoreSums/Write/WriteImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.691Z: JOB_MESSAGE_BASIC: Finished operation UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.765Z: JOB_MESSAGE_DEBUG: Value "WriteUserScoreSums/Write/WriteImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:20.799Z: JOB_MESSAGE_DEBUG: Value "UserScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-11T14:08:29.758Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Terminated
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python_Examples_Dataflow - Build # 1108 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python_Examples_Dataflow - Build # 1108 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1108/ to view the results.