Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/05/12 00:11:06 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #3876

See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3876/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Set 5 second timeout for

[noreply] [BEAM-11055] Don't add log4j to library (#14777)

[noreply] [BEAM-3713] Move dataflow:validatesContainerTests from nosetest to

[noreply] [BEAM-11777][BEAM-11978] Add support for all kwargs in DataFrame, Series

[Brian Hulette] Add DeferredFrameTest._run_inplace_test

[Brian Hulette] Add DataFrame.insert implementation


------------------------------------------
[...truncated 8.94 MB...]
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f1001e4e1e0> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:58 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_create-Impulse_3\n  create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n  create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, 
ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n  create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-Map-decode-_13\n  create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20\n  
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26\n  
write/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27\n  write/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28\n  write/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_17, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29\n  write/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_18, ref_PCollection_PCollection_40, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30\n  write/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31\n  
write/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33\n  write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34\n  write/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GroupShardedRows_35\n  write/BigQueryBatchFileLoads/GroupShardedRows:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36\n  write/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, 
ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37\n  write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DestinationFilesUnion_38\n  write/BigQueryBatchFileLoads/DestinationFilesUnion:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39\n  write/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GroupFilesByTableDestinations_40\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42\n  
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50\n  
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51\n  write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57\n  write/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58\n  
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64\n  write/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70\n  write/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71\n  write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-DeduplicateTables_72\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74\n  write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75\n  write/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81\n  
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82\n  write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-Flatten_83\n  write/BigQueryBatchFileLoads/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ']
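Each entry in the `Stages:` dump above follows a fixed four-part layout: a stage id, then `<transform label>:<urn>`, then a `must follow:` list, then a `downstream_side_inputs:` list, joined by `\n`. A minimal stdlib sketch that pulls those fields out of one entry (the helper name `parse_stage` is mine for illustration, not part of Beam):

```python
def parse_stage(entry: str) -> dict:
    """Split one fn_api_runner 'Stages:' entry into its logged fields."""
    lines = entry.split("\n")
    stage_id = lines[0].strip()
    # The transform line is "<label>:<urn>"; labels may themselves contain
    # colons (e.g. "<lambda at core.py:2930>"), but the URN always starts
    # with "beam:", so split on the last ":beam:" occurrence.
    label, _, urn_tail = lines[1].strip().rpartition(":beam:")
    return {
        "id": stage_id,
        "label": label,
        "urn": "beam:" + urn_tail,
        "must_follow": [s.strip() for s in
                        lines[2].split("must follow:", 1)[1].split(",")
                        if s.strip()],
        "downstream_side_inputs": [s.strip() for s in
                                   lines[3].split("downstream_side_inputs:", 1)[1].split(",")
                                   if s.strip()],
    }


# Sample entry, taken verbatim from the log above.
entry = ("ref_AppliedPTransform_create-Impulse_3\n"
         "  create/Impulse:beam:transform:impulse:v1\n"
         "  must follow: \n"
         "  downstream_side_inputs: ref_PCollection_PCollection_58, "
         "ref_PCollection_PCollection_44")

info = parse_stage(entry)
print(info["id"], info["urn"], len(info["downstream_side_inputs"]))
```

Read this way, the dump says every `create/*` stage has no `must follow:` constraint yet fans out, via side inputs, to the same set of downstream PCollections consumed by the BigQueryBatchFileLoads stages.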
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f1001e4e620> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:58 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_create-Impulse_3\n  create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n  create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, 
ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n  create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-Map-decode-_13\n  create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20\n  
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26\n  
write/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27\n  write/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28\n  write/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_17, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29\n  write/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_18, ref_PCollection_PCollection_40, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30\n  write/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31\n  
write/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33\n  write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34\n  write/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GroupShardedRows_35\n  write/BigQueryBatchFileLoads/GroupShardedRows:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36\n  write/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, 
ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37\n  write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DestinationFilesUnion_38\n  write/BigQueryBatchFileLoads/DestinationFilesUnion:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39\n  write/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GroupFilesByTableDestinations_40\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42\n  
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50\n  
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51\n  write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57\n  write/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58\n  
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64\n  write/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70\n  write/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71\n  write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-DeduplicateTables_72\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74\n  write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75\n  write/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81\n  
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82\n  write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-Flatten_83\n  write/BigQueryBatchFileLoads/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f1001e4e730> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:58 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f1001e4e8c8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:58 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_create-Impulse_3\n  create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n  create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-GroupByKey_10\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, 
ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n  create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-Map-decode-_13\n  create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20\n  
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26\n  
write/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27\n  write/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28\n  write/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_17, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29\n  write/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_18, ref_PCollection_PCollection_40, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30\n  write/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31\n  
write/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33\n  write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34\n  write/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GroupShardedRows_35\n  write/BigQueryBatchFileLoads/GroupShardedRows:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36\n  write/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, 
ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37\n  write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DestinationFilesUnion_38\n  write/BigQueryBatchFileLoads/DestinationFilesUnion:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39\n  write/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GroupFilesByTableDestinations_40\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42\n  
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50\n  
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51\n  write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57\n  write/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58\n  
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64\n  write/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70\n  write/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71\n  write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-DeduplicateTables_72\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74\n  write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75\n  write/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81\n  
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82\n  write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-Flatten_83\n  write/BigQueryBatchFileLoads/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f1001e4e950> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:62 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_create-Impulse_3\n  create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n  create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read\n  
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\n  must follow: create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n  create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-Map-decode-_13\n  create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18\n  
write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26\n  write/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27\n  write/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28\n  write/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_17, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29\n  write/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_18, ref_PCollection_PCollection_40, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30\n  write/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, 
ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31\n  write/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33\n  write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34\n  write/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'write/BigQueryBatchFileLoads/GroupShardedRows/Write\n  write/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/GroupShardedRows/Read\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/GroupShardedRows/Write\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, 
ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36\n  write/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37\n  write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DestinationFilesUnion_38\n  write/BigQueryBatchFileLoads/DestinationFilesUnion:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39\n  write/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 
'write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42\n  write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50\n  write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51\n  write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56\n  
write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57\n  write/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58\n  write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64\n  write/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\n 
 must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70\n  write/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71\n  write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write\n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74\n  write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75\n  write/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76\n  
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82\n  write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-Flatten_83\n  write/BigQueryBatchFileLoads/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f1001e4ea60> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:68 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_create-Impulse_3\n  create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7\n  create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read\n  
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\n  must follow: create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12\n  create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_create-Map-decode-_13\n  create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18\n  
write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26\n  write/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27\n  write/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28\n  write/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_17, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29\n  write/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_18, ref_PCollection_PCollection_40, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30\n  write/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, 
ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31\n  write/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33\n  write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34\n  write/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'write/BigQueryBatchFileLoads/GroupShardedRows/Write\n  write/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/GroupShardedRows/Read\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/GroupShardedRows/Write\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, 
ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36\n  write/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37\n  write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0\n  write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1\n  write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/DestinationFilesUnion/Read\n  write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0, write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39\n  write/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\n  
must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42\n  write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46\n  
write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50\n  write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51\n  write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, 
ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57\n  write/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58\n  write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63\n  
write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64\n  write/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_49', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70\n  write/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71\n  write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\n  must follow: 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write\n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74\n  write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75\n  write/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76\n  write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82\n  write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Transcode/0\n  write/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Transcode/1\n  
write/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Write/0\n  write/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Write/1\n  write/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: write/BigQueryBatchFileLoads/Flatten/Write/1, write/BigQueryBatchFileLoads/Flatten/Write/0\n  downstream_side_inputs: ']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f1001e4eae8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:15 [3, 4, 7, 3, 5, 5, 3, 4, 1, 11, 10, 11, 6, 3, 5]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', '(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\nref_PCollection_PCollection_39/Write:beam:runner:sink:v1\nref_PCollection_PCollection_40/Write:beam:runner:sink:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75)\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  
downstream_side_inputs: ', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_49/Write:beam:runner:sink:v1\n  must follow: ((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_44/Write:beam:runner:sink:v1\n  must follow: 
((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)\n  write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0), 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82)\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: 
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 
'((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\nref_PCollection_PCollection_35/Write:beam:runner:sink:v1\nref_PCollection_PCollection_34/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\nref_PCollection_PCollection_58/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\nwrite/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\nref_PCollection_PCollection_15/Write:beam:runner:sink:v1\nref_PCollection_PCollection_16/Write:beam:runner:sink:v1\nref_PCollection_PCollection_17/Write:beam:runner:sink:v1\nref_PCollection_PCollection_18/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, 
ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', '(((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), 
((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  downstream_side_inputs: ', '((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  ref_PCollection_PCollection_39/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\nref_PCollection_PCollection_45/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write), ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n  create/Impulse:beam:transform:impulse:v1\ncreate/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f1001e4eb70> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:15 [3, 4, 7, 3, 5, 5, 3, 4, 1, 11, 10, 11, 6, 3, 5]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', '(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\nref_PCollection_PCollection_39/Write:beam:runner:sink:v1\nref_PCollection_PCollection_40/Write:beam:runner:sink:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75)\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  
downstream_side_inputs: ', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_49/Write:beam:runner:sink:v1\n  must follow: ((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_44/Write:beam:runner:sink:v1\n  must follow: 
((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)\n  write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0), 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82)\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\n  must follow: 
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 
'((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\nref_PCollection_PCollection_35/Write:beam:runner:sink:v1\nref_PCollection_PCollection_34/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\nref_PCollection_PCollection_58/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\nwrite/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\nref_PCollection_PCollection_15/Write:beam:runner:sink:v1\nref_PCollection_PCollection_16/Write:beam:runner:sink:v1\nref_PCollection_PCollection_17/Write:beam:runner:sink:v1\nref_PCollection_PCollection_18/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, 
ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', '(((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:transform:impulse:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), 
((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  downstream_side_inputs: ', '((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  ref_PCollection_PCollection_39/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\nref_PCollection_PCollection_45/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write), ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n  create/Impulse:beam:transform:impulse:v1\ncreate/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f1001e4ebf8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:15 [3, 4, 7, 3, 5, 5, 3, 4, 1, 11, 10, 11, 6, 3, 5]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ', '(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\nref_PCollection_PCollection_39/Write:beam:runner:sink:v1\nref_PCollection_PCollection_40/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75)\n  write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  
downstream_side_inputs: ', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_49/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:runner:source:v1\n  must follow: ((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_44/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:runner:source:v1\n  must follow: 
((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)\n  write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0), 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82)\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:runner:source:v1\n  must follow: 
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 
'((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\nref_PCollection_PCollection_35/Write:beam:runner:sink:v1\nref_PCollection_PCollection_34/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\nref_PCollection_PCollection_58/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\nwrite/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\nref_PCollection_PCollection_15/Write:beam:runner:sink:v1\nref_PCollection_PCollection_16/Write:beam:runner:sink:v1\nref_PCollection_PCollection_17/Write:beam:runner:sink:v1\nref_PCollection_PCollection_18/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, 
ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', '(((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), 
((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  downstream_side_inputs: ', '((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  ref_PCollection_PCollection_39/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\nref_PCollection_PCollection_45/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write), ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\ncreate/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f1001e4ee18> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:15 [3, 5, 11, 10, 4, 3, 11, 7, 5, 3, 5, 6, 3, 4, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\ncreate/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\nref_PCollection_PCollection_15/Write:beam:runner:sink:v1\nref_PCollection_PCollection_16/Write:beam:runner:sink:v1\nref_PCollection_PCollection_17/Write:beam:runner:sink:v1\nref_PCollection_PCollection_18/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 
'(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\nwrite/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)\n  
write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0), (((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\nref_PCollection_PCollection_35/Write:beam:runner:sink:v1\nref_PCollection_PCollection_34/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\nref_PCollection_PCollection_58/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\nref_PCollection_PCollection_39/Write:beam:runner:sink:v1\nref_PCollection_PCollection_40/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_44/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  
ref_PCollection_PCollection_39/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\nref_PCollection_PCollection_45/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write), ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_45, 
ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_49/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:runner:source:v1\n  must follow: ((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_49', '(((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  downstream_side_inputs: ', '((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75)\n  
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  downstream_side_inputs: ', '(((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82)\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:runner:source:v1\n  must follow: 
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f1001e4ed90> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:15 [3, 5, 11, 10, 4, 3, 11, 7, 5, 3, 5, 6, 3, 4, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\ncreate/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\nref_PCollection_PCollection_15/Write:beam:runner:sink:v1\nref_PCollection_PCollection_16/Write:beam:runner:sink:v1\nref_PCollection_PCollection_17/Write:beam:runner:sink:v1\nref_PCollection_PCollection_18/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 
'(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\nwrite/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)\n  
write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0), (((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\nref_PCollection_PCollection_35/Write:beam:runner:sink:v1\nref_PCollection_PCollection_34/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\nref_PCollection_PCollection_58/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at 
core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\nref_PCollection_PCollection_39/Write:beam:runner:sink:v1\nref_PCollection_PCollection_40/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_44/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  
ref_PCollection_PCollection_39/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\nref_PCollection_PCollection_45/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write), ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_45, 
ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_49/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:runner:source:v1\n  must follow: ((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_49', '(((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at 
core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  downstream_side_inputs: ', '((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75)\n  
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  downstream_side_inputs: ', '(((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82)\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:runner:source:v1\n  must follow: 
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f1001e4eea0> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:15 [3, 5, 11, 10, 4, 3, 11, 7, 5, 3, 5, 6, 3, 4, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)\n  write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n  create/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:runner:sink:v1\ncreate/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/LoadJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/SchemaModJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/CopyJobNamePrefix:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GenerateFilePrefix:beam:transform:pardo:v1\nref_PCollection_PCollection_15/Write:beam:runner:sink:v1\nref_PCollection_PCollection_16/Write:beam:runner:sink:v1\nref_PCollection_PCollection_17/Write:beam:runner:sink:v1\nref_PCollection_PCollection_18/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse:beam:runner:source:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_17, ref_PCollection_PCollection_40, ref_PCollection_PCollection_16, ref_PCollection_PCollection_18, ref_PCollection_PCollection_44, ref_PCollection_PCollection_35, ref_PCollection_PCollection_49, ref_PCollection_PCollection_15', 
'(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:runner:source:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RewindowIntoGlobal:beam:transform:window_into:v1\nwrite/BigQueryBatchFileLoads/AppendDestination:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ParDo(_ShardDestinations):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupShardedRows/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)\n  write/BigQueryBatchFileLoads/GroupShardedRows/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/DropShardNumber:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WriteGroupedRecordsToFile:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0:beam:runner:sink:v1\n  must follow: 
(((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)\n  
write/BigQueryBatchFileLoads/DestinationFilesUnion/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/IdentityWorkaround:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write:beam:runner:sink:v1\n  must follow: (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0), (((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', 
'((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables:beam:transform:pardo:v1\nref_PCollection_PCollection_35/Write:beam:runner:sink:v1\nref_PCollection_PCollection_34/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/1:beam:transform:flatten:v1\nref_PCollection_PCollection_58/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Transcode/0:beam:transform:flatten:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/0:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/Flatten/Write/1:beam:runner:sink:v1\n  must follow: ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write), 
((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_58, ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_34, ref_PCollection_PCollection_35, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at 
core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForTempTableLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema):beam:transform:pardo:v1\nref_PCollection_PCollection_39/Write:beam:runner:sink:v1\nref_PCollection_PCollection_40/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: 
ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_40, ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForSchemaModJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_44/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Impulse:beam:runner:source:v1\n  must follow: ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_44, ref_PCollection_PCollection_45, ref_PCollection_PCollection_49', '((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  
ref_PCollection_PCollection_39/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs):beam:transform:pardo:v1\nref_PCollection_PCollection_45/Write:beam:runner:sink:v1\n  must follow: ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Impulse_53)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-FlatMap-lambda-at-cor_54))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorSchemaModJobs-Map-decode-_56))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForSchemaModJobs_57))+(ref_PCollection_PCollection_44/Write), ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write), ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_45, 
ref_PCollection_PCollection_49', '((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForCopyJobs:beam:transform:pardo:v1\nref_PCollection_PCollection_49/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Impulse:beam:runner:source:v1\n  must follow: ((ref_PCollection_PCollection_39/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-TriggerCopyJobs-_58))+(ref_PCollection_PCollection_45/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_49', '(((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/FlatMap(<lambda at 
core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/PassTables:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write:beam:runner:sink:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Impulse:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1), ((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Impulse_60)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-FlatMap-lambda-at-core-py-_61))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorCopyJobs-Map-decode-_63))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForCopyJobs_64))+(ref_PCollection_PCollection_49/Write)\n  downstream_side_inputs: ', '((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-GetTableNames-Keys_74))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Delete_75)\n  
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read:beam:runner:source:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/RemoveTempTables/Delete:beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Impulse_66)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-FlatMap-lambda-at-core-p_67))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-Impulse-Map-decode-_69))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-PassTables_70))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RemoveTempTables-AddUselessValue_71))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)\n  downstream_side_inputs: ', '(((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Impulse_78)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-FlatMap-lambda-_79))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorDestinationLoadJobs-Map-decode-_81))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForDestinationLoadJobs_82)\n  write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/FlatMap(<lambda at core.py:2930>):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Map(decode):beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/WaitForDestinationLoadJobs:beam:transform:pardo:v1\nwrite/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Impulse:beam:runner:source:v1\n  must follow: 
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ', 'write/BigQueryBatchFileLoads/Flatten/Read\n  write/BigQueryBatchFileLoads/Flatten/Read:beam:runner:source:v1\n  must follow: ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)\n  downstream_side_inputs: ']
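The `Stages:` dump above is the output of the fn_api_runner's translation phase, which fuses chains of ParDo-like transforms into single executable stages, breaking at runner-materialized boundaries such as GroupByKey reads/writes. As a rough toy illustration only (this is not Beam's actual fusion code, and the transform names below are invented), fusion can be thought of as greedily merging consecutive fusible transforms between barriers:

```python
# Toy model of stage fusion, loosely inspired by the translations phase
# logged above. NOT Beam's implementation -- just the core idea: GBK-style
# materialization barriers split the chain, and the ParDo-like transforms
# between barriers collapse into one fused stage.

def fuse(transforms):
    """Group a linear chain of (name, kind) transforms into fused stages.

    kind is 'pardo' (fusible) or 'gbk' (a materialization barrier).
    """
    stages, current = [], []
    for name, kind in transforms:
        if kind == 'gbk':
            if current:
                stages.append(current)
                current = []
            stages.append([name])      # a barrier stands alone
        else:
            current.append(name)       # ParDo-like: fuse into current stage
    if current:
        stages.append(current)
    return stages

# Hypothetical mini-pipeline echoing the shapes seen in the log.
pipeline = [
    ('create/Impulse', 'pardo'),
    ('create/FlatMap', 'pardo'),
    ('create/AddRandomKeys', 'pardo'),
    ('GroupByKey', 'gbk'),
    ('create/Map(decode)', 'pardo'),
]
print(fuse(pipeline))
# -> [['create/Impulse', 'create/FlatMap', 'create/AddRandomKeys'],
#     ['GroupByKey'], ['create/Map(decode)']]
```

This mirrors why each fused stage in the log ends at a `GroupByKey/Write` sink or begins at a `GroupByKey/Read` source, with the `must follow:` clauses recording ordering between the resulting stages.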
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f0ff1701dd8> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Impulse_17)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-FlatMap-lambda-at-core-py-2930-_18))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseEmptyPC-Map-decode-_20)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_46 to finish.
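The start/finish DEBUG lines above show the bundle lifecycle: operations are started consumer-first (so every producer already has a live receiver), elements then flow from the source, and `finish` runs producer-first so each operation can flush into still-open consumers. A minimal sketch of that ordering, using invented class and operation names rather than Beam's actual worker code:

```python
# Toy sketch of the bundle lifecycle visible in the bundle_processor
# DEBUG lines: start runs downstream-first, finish runs source-first.
# Names and structure here are illustrative assumptions, not Beam APIs.

class Operation:
    def __init__(self, name, consumer=None):
        self.name, self.consumer, self.log = name, consumer, []

    def start(self):
        self.log.append('start')

    def process(self, element):
        self.log.append('process')
        if self.consumer:
            self.consumer.process(element)   # push element downstream

    def finish(self):
        self.log.append('finish')

def run_bundle(ops_topological, elements):
    """Run one bundle over ops listed source-first; return lifecycle events."""
    events = []
    for op in reversed(ops_topological):     # start downstream ops first
        op.start()
        events.append(('start', op.name))
    for element in elements:                 # data enters at the source
        ops_topological[0].process(element)
    for op in ops_topological:               # finish source-first
        op.finish()
        events.append(('finish', op.name))
    return events

decode = Operation('Map(decode)')
flat = Operation('FlatMap', consumer=decode)
impulse = Operation('Impulse', consumer=flat)
print(run_bundle([impulse, flat, decode], [b'']))
```

Run against a three-operation chain, this reproduces the order seen in the log: start `Map(decode)`, start `FlatMap`, start `Impulse`, then finish `Impulse`, finish `FlatMap`, finish `Map(decode)`.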
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_create-Impulse_3)+(ref_AppliedPTransform_create-FlatMap-lambda-at-core-py-2930-_4))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-AddRandomKeys_7))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation create/MaybeReshuffle/Reshuffle/AddRandomKeys output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation create/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[create/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation create/Impulse receivers=[SingletonConsumerSet[create/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation create/Impulse receivers=[SingletonConsumerSet[create/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation create/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[create/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation create/MaybeReshuffle/Reshuffle/AddRandomKeys output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_47 to finish.
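The bundle above traces Beam's Reshuffle expansion (AddRandomKeys, Map(reify_timestamps), GroupByKey, then FlatMap(restore_timestamps) and RemoveRandomKeys in a later bundle). A minimal plain-Python sketch of that key-scramble-and-regroup pattern, omitting timestamp reification and using an illustrative key-space size:

```python
import random
from collections import defaultdict

def reshuffle(elements, num_keys=10):
    """Sketch of Reshuffle: AddRandomKeys -> GroupByKey -> RemoveRandomKeys.

    num_keys is an illustrative assumption; the real transform also
    reifies and restores element timestamps, which is omitted here.
    """
    # AddRandomKeys: pair each element with a random shard key so the
    # downstream GroupByKey redistributes elements across workers.
    keyed = [(random.randrange(num_keys), e) for e in elements]
    # GroupByKey: bucket elements by their random key.
    groups = defaultdict(list)
    for key, value in keyed:
        groups[key].append(value)
    # FlatMap(restore_timestamps) / RemoveRandomKeys: emit the values,
    # dropping the synthetic keys.
    return [v for values in groups.values() for v in values]
```

The element set is preserved; only the grouping (and hence the runner's work distribution) changes.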
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Impulse_22)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-FlatMap-lambda-at-core-py-_23))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseSingleElementPC-Map-decode-_25))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-LoadJobNamePrefix_26))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-SchemaModJobNamePrefix_27))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-CopyJobNamePrefix_28))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-GenerateFilePrefix_29))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_15/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_18/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_16/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_17/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/LoadJobNamePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/LoadJobNamePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/SchemaModJobNamePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/SchemaModJobNamePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/CopyJobNamePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/CopyJobNamePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/GenerateFilePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GenerateFilePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=4]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=4]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/GenerateFilePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GenerateFilePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/CopyJobNamePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/CopyJobNamePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/SchemaModJobNamePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/SchemaModJobNamePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/LoadJobNamePrefix output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/LoadJobNamePrefix.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_17/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_16/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_18/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_15/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_48 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_11))+(ref_AppliedPTransform_create-MaybeReshuffle-Reshuffle-RemoveRandomKeys_12))+(ref_AppliedPTransform_create-Map-decode-_13))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-RewindowIntoGlobal_30))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-AppendDestination_31))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-WriteRecordsToFile-ParDo-WriteRecordsToFile_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-_ShardDestinations-_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation write/BigQueryBatchFileLoads/GroupShardedRows/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(_ShardDestinations).out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[DeterministicFastPrimitivesCoder]], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) output_tags=['UnwrittenRecords', 'WrittenFiles', 'None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], ConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out2, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/AppendDestination output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/RewindowIntoGlobal output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RewindowIntoGlobal.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation create/Map(decode) output_tags=['None'], receivers=[SingletonConsumerSet[create/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation create/MaybeReshuffle/Reshuffle/RemoveRandomKeys output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation create/MaybeReshuffle/Reshuffle/RemoveRandomKeys output_tags=['None'], receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation create/Map(decode) output_tags=['None'], receivers=[SingletonConsumerSet[create/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/RewindowIntoGlobal output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RewindowIntoGlobal.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/AppendDestination output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) output_tags=['UnwrittenRecords', 'WrittenFiles', 'None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], ConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out2, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(_ShardDestinations).out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[DeterministicFastPrimitivesCoder]], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation write/BigQueryBatchFileLoads/GroupShardedRows/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1 >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_49 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-DropShardNumber_36))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/DropShardNumber output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DropShardNumber.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, IterableCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation write/BigQueryBatchFileLoads/GroupShardedRows/Read receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupShardedRows/Read.out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[DeterministicFastPrimitivesCoder]], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation write/BigQueryBatchFileLoads/GroupShardedRows/Read receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupShardedRows/Read.out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[DeterministicFastPrimitivesCoder]], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/DropShardNumber output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DropShardNumber.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, IterableCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0 >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_50 to finish.
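Bundles 49 and 50 above run the ParDo(_ShardDestinations) -> GroupShardedRows -> DropShardNumber sequence, which fans a hot destination table out across workers before regrouping its rows. A plain-Python sketch of that sequence, with num_shards as an illustrative stand-in for the runner-chosen shard count:

```python
import random
from collections import defaultdict

def shard_group_and_drop(keyed_rows, num_shards=4):
    """Sketch of _ShardDestinations -> GroupShardedRows -> DropShardNumber.

    keyed_rows is a list of (destination, row) pairs; num_shards is an
    illustrative assumption.
    """
    # _ShardDestinations: key each row by (destination, random shard)
    # so rows for one destination spread across several groups.
    sharded = [((dest, random.randrange(num_shards)), row)
               for dest, row in keyed_rows]
    # GroupShardedRows: group by the composite (destination, shard) key.
    groups = defaultdict(list)
    for key, row in sharded:
        groups[key].append(row)
    # DropShardNumber: discard the shard index, yielding
    # (destination, rows) groups ready for WriteGroupedRecordsToFile.
    return [(dest, rows) for (dest, _shard), rows in groups.items()]
```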
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/IdentityWorkaround output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/IdentityWorkaround.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation write/BigQueryBatchFileLoads/DestinationFilesUnion/Read receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DestinationFilesUnion/Read.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation write/BigQueryBatchFileLoads/DestinationFilesUnion/Read receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DestinationFilesUnion/Read.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/IdentityWorkaround output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/IdentityWorkaround.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_51 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-PartitionFiles-ParDo-PartitionFiles-_42))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithTempTables-ParDo-TriggerLoadJo_44))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-TriggerLoadJobsWithoutTempTables_76))+(ref_PCollection_PCollection_35/Write))+(ref_PCollection_PCollection_34/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_58/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation write/BigQueryBatchFileLoads/Flatten/Write/1 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation write/BigQueryBatchFileLoads/Flatten/Write/0 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_58/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_34/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_35/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <FlattenOperation write/BigQueryBatchFileLoads/Flatten/Transcode/1 receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/1.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <FlattenOperation write/BigQueryBatchFileLoads/Flatten/Transcode/0 receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/0.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs) output_tags=['None', 'TemporaryTables'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=2], SingletonConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out1, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata:80
DEBUG:urllib3.connectionpool:http://metadata:80 "GET /computeMetadata/v1/instance/attributes/job_id HTTP/1.1" 404 1606
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=2]]>
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata:80
DEBUG:urllib3.connectionpool:http://metadata:80 "GET /computeMetadata/v1/instance/attributes/job_id HTTP/1.1" 404 1606
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) output_tags=['SINGLE_PARTITION', 'MULTIPLE_PARTITIONS', 'None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], ConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out2, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.io.gcp.bigquery_file_loads:Load job has 1 files. Job name is beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_4cc857749a25406489b28be3abf73823.
INFO:apache_beam.io.gcp.bigquery_file_loads:Triggering job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_4cc857749a25406489b28be3abf73823 to load data to BigQuery table <TableReference
 datasetId: 'python_write_to_table_16207782606105'
 projectId: 'apache-beam-testing'
 tableId: 'beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_4cc857749a25406489b28be3abf73823'>. Schema: {'fields': [{'name': 'int64', 'type': 'INT64'}, {'name': 'bool', 'type': 'BOOL'}]}. Additional parameters: {'schemaUpdateOptions': ['ALLOW_FIELD_ADDITION']}
INFO:apache_beam.io.gcp.bigquery_tools:Started BigQuery job: <JobReference
 jobId: 'beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_4cc857749a25406489b28be3abf73823'
 location: 'US'
 projectId: 'apache-beam-testing'>
 bq show -j --format=prettyjson --project_id=apache-beam-testing beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_4cc857749a25406489b28be3abf73823
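The 'Additional parameters' logged above show the load job being submitted with schemaUpdateOptions set to ALLOW_FIELD_ADDITION, which lets the append add new nullable columns to the destination table. A sketch of the corresponding load-job payload; the field names follow the BigQuery jobs.insert REST API, and the helper function itself is hypothetical:

```python
def load_job_config(table_ref, schema, write_disposition="WRITE_APPEND"):
    """Hypothetical builder for the load-job configuration implied by
    the log above (BigQuery jobs.insert REST API field names)."""
    return {
        "configuration": {
            "load": {
                # Destination and schema as logged by bigquery_file_loads.
                "destinationTable": table_ref,
                "schema": schema,
                "writeDisposition": write_disposition,
                # Matches 'Additional parameters' in the log: permit the
                # load job to add new nullable fields to the table.
                "schemaUpdateOptions": ["ALLOW_FIELD_ADDITION"],
            }
        }
    }
```

The `bq show -j` command printed alongside each job is the standard way to inspect the resulting job's status and configuration.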
DEBUG:apache_beam.runners.worker.operations:Processing [(('apache-beam-testing:python_write_to_table_16207782606105.python_append_schema_update', <JobReference
 jobId: 'beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_4cc857749a25406489b28be3abf73823'
 location: 'US'
 projectId: 'apache-beam-testing'>), 9223371950454.773, (GlobalWindow,), PaneInfo(first: True, last: True, timing: ON_TIME, index: 0, nonspeculative_index: 0))] in <FlattenOperation write/BigQueryBatchFileLoads/Flatten/Transcode/1 receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/1.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.io.gcp.bigquery_file_loads:Load job has 1 files. Job name is beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_3828aa9119714a43a0ea9d92e811795d.
INFO:apache_beam.io.gcp.bigquery_file_loads:Triggering job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_3828aa9119714a43a0ea9d92e811795d to load data to BigQuery table <TableReference
 datasetId: 'python_write_to_table_16207782606105'
 projectId: 'apache-beam-testing'
 tableId: 'beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_3828aa9119714a43a0ea9d92e811795d'>. Schema: {'fields': [{'name': 'int64', 'type': 'INT64'}, {'name': 'bool', 'type': 'BOOL'}]}. Additional parameters: {'schemaUpdateOptions': ['ALLOW_FIELD_ADDITION']}
INFO:apache_beam.io.gcp.bigquery_tools:Started BigQuery job: <JobReference
 jobId: 'beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_3828aa9119714a43a0ea9d92e811795d'
 location: 'US'
 projectId: 'apache-beam-testing'>
 bq show -j --format=prettyjson --project_id=apache-beam-testing beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_3828aa9119714a43a0ea9d92e811795d
DEBUG:apache_beam.runners.worker.operations:Processing [(('apache-beam-testing:python_write_to_table_16207782606105.python_append_schema_update', <JobReference
 jobId: 'beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_198_6c2b32328541758daadec34a69ec6423_3828aa9119714a43a0ea9d92e811795d'
 location: 'US'
 projectId: 'apache-beam-testing'>), 9223371950454.773, (GlobalWindow,), PaneInfo(first: True, last: True, timing: ON_TIME, index: 0, nonspeculative_index: 0))] in <FlattenOperation write/BigQueryBatchFileLoads/Flatten/Transcode/1 receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/1.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[DeterministicFastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) output_tags=['SINGLE_PARTITION', 'MULTIPLE_PARTITIONS', 'None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], ConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out2, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=2]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs) output_tags=['None', 'TemporaryTables'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=2], SingletonConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out1, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <FlattenOperation write/BigQueryBatchFileLoads/Flatten/Transcode/0 receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/0.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <FlattenOperation write/BigQueryBatchFileLoads/Flatten/Transcode/1 receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/1.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_35/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_34/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation ref_PCollection_PCollection_58/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation write/BigQueryBatchFileLoads/Flatten/Write/0 >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation write/BigQueryBatchFileLoads/Flatten/Write/1 >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the bundle bundle_52 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Impulse_46)+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-FlatMap-lambda-at-core-py-_47))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ImpulseMonitorLoadJobs-Map-decode-_49))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-WaitForTempTableLoadJobs_50))+(ref_AppliedPTransform_write-BigQueryBatchFileLoads-ParDo-UpdateDestinationSchema-_51))+(ref_PCollection_PCollection_39/Write))+(ref_PCollection_PCollection_40/Write)
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata:80
DEBUG:urllib3.connectionpool:http://metadata:80 "GET /computeMetadata/v1/instance/attributes/job_id HTTP/1.1" 404 1606
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_40/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation ref_PCollection_PCollection_39/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs output_tags=['None'], receivers=[ConsumerSet[write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=2]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>) output_tags=['None'], receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/FlatMap(<lambda at core.py:2930>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
INFO:root:Job status: RUNNING
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
Terminated

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED
Terminated
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=0b045be3-77d3-4d08-9e9f-2d8c23311b13, currentDir=<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 16857
  log file: /home/jenkins/.gradle/daemon/6.8.3/daemon-16857.out.log
----- Last  20 lines from daemon log file - daemon-16857.out.log -----
	at org.gradle.process.internal.DefaultExecHandle.execExceptionFor(DefaultExecHandle.java:241)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:218)
	at org.gradle.process.internal.DefaultExecHandle.failed(DefaultExecHandle.java:369)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:87)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
	at java.lang.Runtime.removeShutdownHook(Runtime.java:239)
	at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:33)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
	at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:365)
	at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:108)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:84)
	... 7 more
----- End of the daemon log -----
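For reference, a daemon log like the one excerpted above can be pulled from an agent with a small shell sketch. The `~/.gradle/daemon/<gradle-version>/daemon-<pid>.out.log` layout is assumed from the path Jenkins printed (`/home/jenkins/.gradle/daemon/6.8.3/daemon-16857.out.log`); `GRADLE_USER_HOME` is the standard override for that root.

```shell
# Sketch: tail the most recently written Gradle daemon log.
# Layout assumed from the path above: $GRADLE_USER_HOME/daemon/<version>/daemon-<pid>.out.log
GRADLE_USER_HOME="${GRADLE_USER_HOME:-$HOME/.gradle}"
latest_log=$(ls -t "$GRADLE_USER_HOME"/daemon/*/daemon-*.out.log 2>/dev/null | head -n 1)
if [ -n "$latest_log" ]; then
  tail -n 20 "$latest_log"
else
  echo "no daemon logs found under $GRADLE_USER_HOME/daemon"
fi
```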


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #3877

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3877/display/redirect?page=changes>

