Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/03/06 06:59:18 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #7575

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7575/display/redirect>

------------------------------------------
[...truncated 642.77 KB...]
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s35"
        }, 
        "serialized_fn": "<string of 976 bytes>", 
        "user_name": "Write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s37", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DeleteTablesFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Write/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s36"
        }, 
        "serialized_fn": "<string of 412 bytes>", 
        "user_name": "Write/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
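(For context: the step graph above, and the fused stages in the log below, are what the Beam Python SDK emits for a pipeline writing to BigQuery via file loads. The step names "read", "months with tornadoes", "monthly count", and "format" suggest a bigquery_tornadoes-style job; the following is a minimal sketch of such a pipeline, with the output table, schema, and options as illustrative assumptions rather than the test's actual code.)

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Illustrative options; a real run passes --runner=DataflowRunner,
# --project, --temp_location, etc. on the command line.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    # 'read': rows from the public weather_stations sample table.
    rows = p | 'read' >> beam.io.Read(beam.io.BigQuerySource(
        'clouddataflow-readonly:samples.weather_stations'))
    counts = (
        rows
        # 'months with tornadoes': emit (month, 1) for rows that saw a tornado.
        | 'months with tornadoes' >> beam.FlatMap(
            lambda row: [(int(row['month']), 1)] if row['tornado'] else [])
        # 'monthly count': sum tornado occurrences per month.
        | 'monthly count' >> beam.CombinePerKey(sum)
        # 'format': shape each (month, count) pair as a BigQuery row dict.
        | 'format' >> beam.Map(
            lambda kv: {'month': kv[0], 'tornado_count': kv[1]}))
    # 'Write' with FILE_LOADS expands into the Write/BigQueryBatchFileLoads/*
    # transforms seen in the graph above, including the
    # RemoveTempTables/GetTableNames and RemoveTempTables/Delete cleanup
    # ParDos. The destination table here is assumed.
    counts | 'Write' >> beam.io.WriteToBigQuery(
        'some_project:some_dataset.monthly_tornadoes',
        schema='month:INTEGER,tornado_count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
        method=beam.io.WriteToBigQuery.Method.FILE_LOADS)
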
root: INFO: Create job: <Job
 createTime: u'2019-03-06T06:08:33.386734Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-03-05_22_08_32-12752060957670175603'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306060823-462489'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-03-06T06:08:33.386734Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-03-05_22_08_32-12752060957670175603]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_32-12752060957670175603?project=apache-beam-testing
root: INFO: Job 2019-03-05_22_08_32-12752060957670175603 is in state JOB_STATE_RUNNING
root: INFO: 2019-03-06T06:08:32.451Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-03-05_22_08_32-12752060957670175603. The number of workers will be between 1 and 1000.
root: INFO: 2019-03-06T06:08:32.609Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-03-05_22_08_32-12752060957670175603.
root: INFO: 2019-03-06T06:08:35.505Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-03-06T06:08:36.542Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-03-06T06:08:37.185Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T06:08:37.313Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T06:08:37.366Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Write/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T06:08:37.415Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T06:08:37.458Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-03-06T06:08:37.978Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-03-06T06:08:38.114Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-03-06T06:08:38.157Z: JOB_MESSAGE_DETAILED: Unzipping flatten s19 for input s13.out_WrittenFiles
root: INFO: 2019-03-06T06:08:38.195Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten Write/BigQueryBatchFileLoads/DestinationFilesUnion, into producer Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T06:08:38.236Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-03-06T06:08:38.268Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-03-06T06:08:38.320Z: JOB_MESSAGE_DETAILED: Unzipping flatten s19-u58 for input s20-reify-value27-c56
root: INFO: 2019-03-06T06:08:38.361Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten Write/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T06:08:38.406Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into Write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-03-06T06:08:38.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T06:08:38.494Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Reify into monthly count/GroupByKey+monthly count/Combine/Partial
root: INFO: 2019-03-06T06:08:38.532Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into Write/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-03-06T06:08:38.576Z: JOB_MESSAGE_DETAILED: Fusing consumer format into monthly count/Combine/Extract
root: INFO: 2019-03-06T06:08:38.625Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/AppendDestination into Write/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-03-06T06:08:38.685Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey+monthly count/Combine/Partial into months with tornadoes
root: INFO: 2019-03-06T06:08:38.732Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/ApplyGlobalWindow into format
root: INFO: 2019-03-06T06:08:38.776Z: JOB_MESSAGE_DETAILED: Fusing consumer months with tornadoes into read
root: INFO: 2019-03-06T06:08:38.813Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine/Extract into monthly count/Combine
root: INFO: 2019-03-06T06:08:38.864Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Write into monthly count/GroupByKey/Reify
root: INFO: 2019-03-06T06:08:38.907Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into Write/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-03-06T06:08:38.962Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/DropShardNumber into Write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-03-06T06:08:39.003Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine into monthly count/GroupByKey/Read
root: INFO: 2019-03-06T06:08:39.041Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GroupShardedRows/Write into Write/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-03-06T06:08:39.080Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into Write/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-03-06T06:08:39.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T06:08:39.163Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GroupShardedRows/Reify into Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-03-06T06:08:39.210Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/GenerateFilePrefix into Write/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-03-06T06:08:39.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract into Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-03-06T06:08:39.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-03-06T06:08:39.337Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into Write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-03-06T06:08:39.382Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial into Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid
root: INFO: 2019-03-06T06:08:39.430Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid into Write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-03-06T06:08:39.464Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into Write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-03-06T06:08:39.517Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into Write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-03-06T06:08:39.559Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write into Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-03-06T06:08:39.597Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into Write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-03-06T06:08:39.627Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>) into Write/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-03-06T06:08:39.691Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/Delete into Write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-03-06T06:08:39.726Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify into Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial
root: INFO: 2019-03-06T06:08:39.747Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine into Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-03-06T06:08:39.799Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-03-06T06:08:39.819Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-03-06T06:08:39.858Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-03-06T06:08:39.902Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-03-06T06:08:40.111Z: JOB_MESSAGE_DEBUG: Executing wait step start72
root: INFO: 2019-03-06T06:08:40.203Z: JOB_MESSAGE_BASIC: Executing operation Write/BigQueryBatchFileLoads/ImpulseJobName/Read+Write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>)
root: INFO: 2019-03-06T06:08:40.251Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-03-06T06:08:40.255Z: JOB_MESSAGE_BASIC: Executing operation Write/BigQueryBatchFileLoads/CreateFilePrefixView/Read+Write/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-03-06T06:08:40.294Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-03-06T06:08:40.294Z: JOB_MESSAGE_BASIC: Executing operation Write/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-03-06T06:08:40.344Z: JOB_MESSAGE_BASIC: Executing operation Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-03-06T06:08:40.378Z: JOB_MESSAGE_BASIC: Executing operation monthly count/GroupByKey/Create
root: INFO: 2019-03-06T06:08:40.422Z: JOB_MESSAGE_BASIC: Executing operation Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-03-06T06:08:40.469Z: JOB_MESSAGE_DEBUG: Value "Write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-03-06T06:08:40.517Z: JOB_MESSAGE_DEBUG: Value "Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-03-06T06:08:40.566Z: JOB_MESSAGE_DEBUG: Value "monthly count/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T06:08:40.607Z: JOB_MESSAGE_DEBUG: Value "Write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T06:08:40.655Z: JOB_MESSAGE_BASIC: Executing operation read+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
root: INFO: 2019-03-06T06:08:41.250Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_18161262152394465692" started. You can check its status with the bq tool: "bq show -j --project_id=clouddataflow-readonly dataflow_job_18161262152394465692".
root: INFO: 2019-03-06T06:08:57.099Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-06T06:09:11.671Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_18161262152394465692" observed total of 1 exported files thus far.
root: INFO: 2019-03-06T06:09:11.751Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_18161262152394465692"
root: INFO: 2019-03-06T06:09:31.124Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-06T06:10:11.914Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-03-06T06:10:12.072Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 29 tests in 3054.916s

FAILED (SKIP=1, errors=2, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_36-10444656817925785307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_23_23-13147682064142184609?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_32-1329246710157630395?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_34-16367907733248289654?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_20_41-14775840123229782966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_30_24-7625114548183074240?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_41_06-4791680811536682276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_35-3772432671455605337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_27_33-11146321559778058278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_33-12831697396166856828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_16_24-9184037182707021734?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_23_51-8068018440860552548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_32-12752060957670175603?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_13_49-2263378864747813372?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_21_38-15556450956670679086?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_29_45-14376827364100336525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_32-9005388181542931825?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_16_28-14258683838273117291?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_24_10-6917145238907081720?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_31_42-7180979809744164118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_39_58-6065836691940193640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_46_33-875404404603148310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_53_04-13320401966391532693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_08_33-8673137190772844377?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_16_53-11128755611883896682?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_26_45-14099706461626608494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-05_22_35_17-7193942181287715470?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 278

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
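(The flags Gradle suggests above are the quickest way to get more detail when reproducing this locally; assuming a checkout of the Beam source at the matching revision, the failing task can be rerun as, for example, "./gradlew :beam-sdks-python:postCommitIT --stacktrace --info", or with "--scan" to publish a build scan like the one linked below.)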

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 6s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/fplq7tjlyljzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #7580

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7580/display/redirect?page=changes>


---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Python_Verify #7579

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7579/display/redirect?page=changes>

Changes:

[github] Add blog post for 2.11 release. (#7996)

[github] Typo fixes to the Beam 2.11 blog.

------------------------------------------
[...truncated 513.27 KB...]
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s32"
        }, 
        "serialized_fn": "<string of 324 bytes>", 
        "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-03-06T19:21:52.225443Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-03-06_11_21_48-10732408580585692170'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306192139-869928'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-03-06T19:21:52.225443Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-03-06_11_21_48-10732408580585692170]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170?project=apache-beam-testing
root: INFO: Job 2019-03-06_11_21_48-10732408580585692170 is in state JOB_STATE_RUNNING
root: INFO: 2019-03-06T19:21:48.548Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-03-06_11_21_48-10732408580585692170. The number of workers will be between 1 and 1000.
root: INFO: 2019-03-06T19:21:48.612Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-03-06_11_21_48-10732408580585692170.
root: INFO: 2019-03-06T19:21:54.379Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-03-06T19:21:55.337Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-03-06T19:21:55.940Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T19:21:56.011Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T19:21:56.066Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T19:21:56.133Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T19:21:56.197Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-03-06T19:21:56.397Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-03-06T19:21:56.600Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-03-06T19:21:56.655Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s14.out
root: INFO: 2019-03-06T19:21:56.708Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten write/BigQueryBatchFileLoads/DestinationFilesUnion, into producer write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-03-06T19:21:56.753Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-03-06T19:21:56.833Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-03-06T19:21:56.875Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15-u40 for input s16-reify-value18-c38
root: INFO: 2019-03-06T19:21:56.922Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten write/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T19:21:56.971Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T19:21:57.017Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T19:21:57.068Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into write/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-03-06T19:21:57.119Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/AppendDestination into write/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-03-06T19:21:57.168Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into write/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-03-06T19:21:57.203Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/DropShardNumber into write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-03-06T19:21:57.244Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/Write into write/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-03-06T19:21:57.303Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/Reify into write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-03-06T19:21:57.351Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into write/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-03-06T19:21:57.404Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ApplyGlobalWindow into read
root: INFO: 2019-03-06T19:21:57.456Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T19:21:57.511Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GenerateFilePrefix into write/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-03-06T19:21:57.555Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-03-06T19:21:57.606Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid
root: INFO: 2019-03-06T19:21:57.656Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-03-06T19:21:57.695Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial
root: INFO: 2019-03-06T19:21:57.754Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid into write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-03-06T19:21:57.806Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-03-06T19:21:57.858Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-03-06T19:21:57.910Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-03-06T19:21:57.974Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-03-06T19:21:58.044Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-03-06T19:21:58.096Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>) into write/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-03-06T19:21:58.148Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-03-06T19:21:58.185Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/Delete into write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-03-06T19:21:58.251Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-03-06T19:21:58.309Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-03-06T19:21:58.356Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-03-06T19:21:58.401Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-03-06T19:21:58.656Z: JOB_MESSAGE_DEBUG: Executing wait step start53
root: INFO: 2019-03-06T19:21:58.762Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseJobName/Read+write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>)
root: INFO: 2019-03-06T19:21:58.814Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/CreateFilePrefixView/Read+write/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-03-06T19:21:58.826Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-03-06T19:21:58.871Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-03-06T19:21:58.871Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-03-06T19:21:58.927Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-03-06T19:21:58.977Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-03-06T19:21:59.108Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-03-06T19:21:59.240Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-03-06T19:21:59.302Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T19:22:08.478Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-06T19:22:56.089Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170/messages?alt=json&startTime=2019-03-06T19%3A22%3A56.089Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170/messages?alt=json&startTime=2019-03-06T19%3A22%3A56.089Z>: response: <{'status': '503', 'content-length': '122', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 19:24:18 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "The service is currently unavailable.",
    "status": "UNAVAILABLE"
  }
}
>
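(The DEBUG lines above show the API client retrying a transient 503 from the Dataflow service before the job proceeds normally. A minimal sketch of that retry-with-backoff pattern, as an illustration rather than the client's actual implementation; the function name and backoff constants are assumptions:)

import time
import urllib.request
from urllib.error import HTTPError

def get_with_retry(url, max_attempts=5, base_delay=1.0):
    """Fetch url, retrying transient 503s with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except HTTPError as e:
            if e.code != 503 or attempt == max_attempts - 1:
                raise  # non-transient error, or out of retries
            time.sleep(base_delay * (2 ** attempt))  # sleep 1s, 2s, 4s, ...
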
root: INFO: 2019-03-06T19:23:45.550Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-03-06T19:23:45.591Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-03-06T19:25:56.940Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>).out" materialized.
root: INFO: 2019-03-06T19:25:57.060Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0)
root: INFO: 2019-03-06T19:25:57.114Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0)
root: INFO: 2019-03-06T19:25:57.196Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0).output" materialized.
root: INFO: 2019-03-06T19:25:57.232Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0).output" materialized.
root: INFO: 2019-03-06T19:26:00.400Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
root: INFO: 2019-03-06T19:26:00.529Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-03-06T19:26:00.609Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-03-06T19:26:00.717Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-03-06T19:26:00.764Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-03-06T19:26:00.902Z: JOB_MESSAGE_BASIC: Executing operation read+write/BigQueryBatchFileLoads/ApplyGlobalWindow+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
root: INFO: 2019-03-06T19:26:01.219Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_10452321233229336488". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_10452321233229336488".
root: INFO: 2019-03-06T19:27:47.149Z: JOB_MESSAGE_BASIC: BigQuery query completed, job : "dataflow_job_10452321233229336488"
root: INFO: 2019-03-06T19:27:47.612Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_3634672239312776342" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_3634672239312776342".
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170/messages?alt=json&startTime=2019-03-06T19%3A27%3A47.612Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170/messages?alt=json&startTime=2019-03-06T19%3A27%3A47.612Z>: response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 19:28:38 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
root: INFO: 2019-03-06T19:28:17.990Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_3634672239312776342" observed total of 1 exported files thus far.
root: INFO: 2019-03-06T19:28:18.034Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_3634672239312776342"
root: INFO: 2019-03-06T19:28:21.447Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
root: INFO: 2019-03-06T19:28:21.592Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
root: INFO: 2019-03-06T19:28:24.617Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
root: INFO: 2019-03-06T19:28:24.752Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-03-06T19:28:38.291Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs).out" materialized.
root: INFO: 2019-03-06T19:28:38.349Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
root: INFO: 2019-03-06T19:28:38.402Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-03-06T19:28:38.464Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-03-06T19:28:38.531Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
root: INFO: 2019-03-06T19:28:38.607Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
root: INFO: 2019-03-06T19:28:38.671Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-03-06T19:28:45.219Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
root: INFO: 2019-03-06T19:28:45.318Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-03-06T19:28:45.446Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
root: INFO: 2019-03-06T19:28:45.583Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write
root: INFO: 2019-03-06T19:28:50.358Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Close
root: INFO: 2019-03-06T19:28:50.476Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-03-06T19:28:53.494Z: JOB_MESSAGE_DEBUG: Executing success step success51
root: INFO: 2019-03-06T19:28:53.646Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-03-06T19:28:53.791Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-03-06T19:28:53.864Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170/messages?alt=json&startTime=2019-03-06T19%3A28%3A53.864Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170/messages?alt=json&startTime=2019-03-06T19%3A28%3A53.864Z>: response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 19:29:41 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 29 tests in 3153.580s

FAILED (SKIP=1, failures=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_32-17183379432656637747?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_01_24-16884156586994520400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_09_54-2999771900743305825?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_16_31-349295147300128161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_23_45-1456504733885548291?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_31_51-7896473373324614040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_38_13-14841954449776048225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_34-12228643475699243118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_09_30-17327525865141191754?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_18_13-17549505521041121306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_32-9382652417395514548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_34-15860407522935474285?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_07_58-5643058555413116683?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_17_56-9113601783806509999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_21_48-10732408580585692170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_32-15713095880058592217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_12_17-2496003328366576614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_19_16-14574602040598072740?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_31-9189380508906770579?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_01_14-11692067158254947217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_09_39-1949303573121731032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_32-4824031441702846218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_02_13-5255907761185977109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_09_49-7638207134077862011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_53_32-3174171005449491739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_02_07-5490030322212926432?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_11_12_52-4636051688483197520?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 278

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 35s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/qdzya6b7zdmas

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Python_Verify #7578

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7578/display/redirect>

------------------------------------------
[...truncated 1.96 MB...]
      "name": "s68", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DeleteTablesFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s67"
        }, 
        "serialized_fn": "<string of 324 bytes>", 
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
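
The DeleteTablesFn step serialized in the JSON above is the cleanup stage of the BigQuery file-loads sink: once temporary tables have been copied into their final destinations, their names are collected (GetTableNames) and each one is deleted (Delete). A rough, illustrative analogue of such a DoFn is sketched below; it is not the Beam implementation, and it uses the standard google-cloud-bigquery client rather than the HTTP client the SDK uses internally.

    import apache_beam as beam
    from google.cloud import bigquery

    class DeleteTablesFnSketch(beam.DoFn):
        """Illustrative stand-in for the DeleteTablesFn step above:
        receives fully qualified temp table names and deletes them."""

        def start_bundle(self):
            # One client per bundle, created lazily so the DoFn stays picklable.
            self._client = bigquery.Client()

        def process(self, table_reference):
            # table_reference is assumed to be a "project.dataset.table" string.
            # not_found_ok makes the delete idempotent across retried bundles.
            self._client.delete_table(table_reference, not_found_ok=True)
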
root: INFO: Create job: <Job
 createTime: u'2019-03-06T18:18:00.027192Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-03-06_10_17_59-10782356415447103105'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306181750-819300'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-03-06T18:18:00.027192Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-03-06_10_17_59-10782356415447103105]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_17_59-10782356415447103105?project=apache-beam-testing
root: INFO: Job 2019-03-06_10_17_59-10782356415447103105 is in state JOB_STATE_RUNNING
root: INFO: 2019-03-06T18:17:59.244Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-03-06_10_17_59-10782356415447103105. The number of workers will be between 1 and 1000.
root: INFO: 2019-03-06T18:17:59.327Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-03-06_10_17_59-10782356415447103105.
root: INFO: 2019-03-06T18:18:02.607Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-03-06T18:18:03.491Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-03-06T18:18:04.166Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T18:18:04.253Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T18:18:04.321Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T18:18:04.382Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T18:18:04.450Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T18:18:04.508Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T18:18:04.566Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T18:18:04.618Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-03-06T18:18:05.119Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-03-06T18:18:05.255Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-03-06T18:18:05.323Z: JOB_MESSAGE_DETAILED: Unzipping flatten s18 for input s12.out_WrittenFiles
root: INFO: 2019-03-06T18:18:05.420Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T18:18:05.481Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-03-06T18:18:05.545Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-03-06T18:18:05.596Z: JOB_MESSAGE_DETAILED: Unzipping flatten s50 for input s44.out_WrittenFiles
root: INFO: 2019-03-06T18:18:05.650Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T18:18:05.717Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-03-06T18:18:05.759Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-03-06T18:18:05.817Z: JOB_MESSAGE_DETAILED: Unzipping flatten s18-u85 for input s19-reify-value27-c83
root: INFO: 2019-03-06T18:18:05.881Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T18:18:05.926Z: JOB_MESSAGE_DETAILED: Unzipping flatten s50-u90 for input s51-reify-value63-c88
root: INFO: 2019-03-06T18:18:05.982Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T18:18:06.079Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ApplyGlobalWindow into FlatMap(<lambda at bigquery_file_loads_test.py:384>)
root: INFO: 2019-03-06T18:18:06.116Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ApplyGlobalWindow into FlatMap(<lambda at bigquery_file_loads_test.py:384>)
root: INFO: 2019-03-06T18:18:06.175Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-03-06T18:18:06.232Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-03-06T18:18:06.296Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T18:18:06.344Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T18:18:06.401Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-03-06T18:18:06.452Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-03-06T18:18:06.494Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T18:18:06.556Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-03-06T18:18:06.615Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-03-06T18:18:06.665Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-03-06T18:18:06.727Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into Map(<lambda at bigquery_file_loads_test.py:382>)
root: INFO: 2019-03-06T18:18:06.776Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at bigquery_file_loads_test.py:382>) into Create/Read
root: INFO: 2019-03-06T18:18:06.826Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-03-06T18:18:06.872Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-03-06T18:18:06.914Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-03-06T18:18:06.959Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-03-06T18:18:07.005Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-03-06T18:18:07.052Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at bigquery_file_loads_test.py:384>) into GroupByKey/GroupByWindow
root: INFO: 2019-03-06T18:18:07.126Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-03-06T18:18:07.207Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-03-06T18:18:07.271Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T18:18:07.339Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-03-06T18:18:07.410Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-03-06T18:18:07.464Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-03-06T18:18:07.516Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-03-06T18:18:07.565Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial
root: INFO: 2019-03-06T18:18:07.621Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-03-06T18:18:07.681Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-03-06T18:18:07.727Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-03-06T18:18:07.776Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid
root: INFO: 2019-03-06T18:18:07.828Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>) into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-03-06T18:18:07.904Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-03-06T18:18:07.953Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-03-06T18:18:07.993Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-03-06T18:18:08.047Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-03-06T18:18:08.113Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-03-06T18:18:08.158Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-03-06T18:18:08.208Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-03-06T18:18:08.253Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-03-06T18:18:08.298Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid
root: INFO: 2019-03-06T18:18:08.347Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-03-06T18:18:08.408Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-03-06T18:18:08.443Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-03-06T18:18:08.499Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-03-06T18:18:08.540Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-03-06T18:18:08.589Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-03-06T18:18:08.644Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-03-06T18:18:08.700Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial
root: INFO: 2019-03-06T18:18:08.753Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-03-06T18:18:08.797Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-03-06T18:18:08.849Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-03-06T18:18:08.883Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-03-06T18:18:08.938Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-03-06T18:18:08.995Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-03-06T18:18:09.052Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-03-06T18:18:09.099Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-03-06T18:18:09.153Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-03-06T18:18:09.493Z: JOB_MESSAGE_DEBUG: Executing wait step start113
root: INFO: 2019-03-06T18:18:09.647Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>)
root: INFO: 2019-03-06T18:18:09.696Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-03-06T18:18:09.733Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-03-06T18:18:09.746Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>)
root: INFO: 2019-03-06T18:18:09.792Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-03-06T18:18:09.803Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-03-06T18:18:09.869Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-03-06T18:18:09.940Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-03-06T18:18:09.995Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-03-06T18:18:10.061Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-03-06T18:18:10.123Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-03-06T18:18:10.208Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-03-06T18:18:10.248Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-03-06T18:18:10.338Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-03-06T18:18:10.402Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-03-06T18:18:10.470Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-03-06T18:18:10.525Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T18:18:10.590Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-03-06T18:18:10.650Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-03-06T18:18:10.714Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T18:18:10.768Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:382>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-03-06T18:18:21.061Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_10_17_59-10782356415447103105/messages?alt=json&startTime=2019-03-06T18%3A18%3A21.061Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_10_17_59-10782356415447103105/messages?alt=json&startTime=2019-03-06T18%3A18%3A21.061Z>: response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 18:19:25 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
root: INFO: 2019-03-06T18:20:33.605Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-06T18:21:23.457Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-03-06T18:21:23.505Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 29 tests in 2247.414s

FAILED (SKIP=1, errors=4, failures=10)
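
The 503 "Deadline exceeded" responses in the captured logging above are transient: the SDK logs them at DEBUG level and retries the jobs/.../messages request rather than failing the test. A minimal sketch of that retry-on-503 pattern, assuming a hypothetical fetch(url) callable standing in for the SDK's HTTP client and made-up backoff parameters:

    import time

    def get_with_retries(fetch, url, max_attempts=5, base_delay=1.0):
        """Retry a request on HTTP 503, doubling the delay each attempt.

        `fetch` is a hypothetical callable returning an object with a
        `status` attribute; it stands in for the real HTTP client."""
        for attempt in range(max_attempts):
            response = fetch(url)
            if response.status != 503:
                return response
            # Transient "Deadline exceeded" from the service: back off and retry.
            time.sleep(base_delay * (2 ** attempt))
        raise RuntimeError('Gave up after %d attempts on %s' % (max_attempts, url))
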
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_40-11726064318150014117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_10_27-8158485014309472577?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_17_45-13384381642046850380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_26_32-14048493429517037697?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_32_49-2311575283235812148?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_42-12849074094345924936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_20_16-10963456019695246042?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_41-9529855461230571031?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_17_59-7751960533498215585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_43-13779675504597693887?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_19_14-18150790730270150561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_40-6979527832607919560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_17_59-10782356415447103105?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_26_17-3418480377318970255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_40-4510256177197341201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_14_45-10130981775156417105?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_19_27-11296979788461512136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_40-2345373559979129275?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_13_33-9029147505690337165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_20_13-11232312990168586971?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_27_24-3924956616294511686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_05_40-2279883554157621948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_10_14_25-13185730238756550383?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 278

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 28s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/5fcqxmr5dsa3m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7577

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7577/display/redirect?page=changes>

Changes:

[echauchot] Fix Joda error raised by error-prone

------------------------------------------
[...truncated 1.54 MB...]
            }, 
            "output_name": "out", 
            "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s31"
        }, 
        "serialized_fn": "<string of 976 bytes>", 
        "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s33", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DeleteTablesFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s32"
        }, 
        "serialized_fn": "<string of 324 bytes>", 
        "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
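
The write/BigQueryBatchFileLoads/* steps in the graph above come from writing a PCollection to BigQuery via load jobs: rows are staged to files, loaded into temporary tables, copied to the destination, and the temp tables removed. As a rough usage sketch only, assuming a current SDK where WriteToBigQuery exposes the FILE_LOADS method, and with placeholder table and bucket names:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.Create([{'name': 'a', 'language': 'python'}])
         | 'write' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',           # placeholder table spec
             schema='name:STRING,language:STRING',
             method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
             custom_gcs_temp_location='gs://my-bucket/tmp'))  # placeholder bucket

The pipeline graph this produces is what the Dataflow service is fusing and executing in the JOB_MESSAGE_DETAILED lines that follow.
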
root: INFO: Create job: <Job
 createTime: u'2019-03-06T15:08:16.550710Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-03-06_07_08_15-3338326682523013640'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306150806-763175'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-03-06T15:08:16.550710Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-03-06_07_08_15-3338326682523013640]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640?project=apache-beam-testing
root: INFO: Job 2019-03-06_07_08_15-3338326682523013640 is in state JOB_STATE_RUNNING
root: INFO: 2019-03-06T15:08:15.674Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-03-06_07_08_15-3338326682523013640. The number of workers will be between 1 and 1000.
root: INFO: 2019-03-06T15:08:15.773Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-03-06_07_08_15-3338326682523013640.
root: INFO: 2019-03-06T15:08:18.698Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-03-06T15:08:19.597Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-03-06T15:08:20.183Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T15:08:20.240Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T15:08:20.300Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T15:08:20.337Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T15:08:20.673Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-03-06T15:08:20.866Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-03-06T15:08:21.104Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-03-06T15:08:21.167Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s14.out
root: INFO: 2019-03-06T15:08:21.259Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten write/BigQueryBatchFileLoads/DestinationFilesUnion, into producer write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-03-06T15:08:21.321Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-03-06T15:08:21.398Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-03-06T15:08:21.505Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15-u40 for input s16-reify-value18-c38
root: INFO: 2019-03-06T15:08:21.605Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten write/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T15:08:21.681Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T15:08:21.735Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T15:08:21.825Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into write/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-03-06T15:08:21.900Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/AppendDestination into write/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-03-06T15:08:21.953Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into write/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-03-06T15:08:22.017Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/DropShardNumber into write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-03-06T15:08:22.061Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/Write into write/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-03-06T15:08:22.118Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/Reify into write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-03-06T15:08:22.187Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into write/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-03-06T15:08:22.247Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ApplyGlobalWindow into read
root: INFO: 2019-03-06T15:08:22.295Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T15:08:22.343Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GenerateFilePrefix into write/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-03-06T15:08:22.400Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-03-06T15:08:22.433Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid
root: INFO: 2019-03-06T15:08:22.466Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-03-06T15:08:22.523Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial
root: INFO: 2019-03-06T15:08:22.557Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid into write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-03-06T15:08:22.608Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-03-06T15:08:22.672Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-03-06T15:08:22.759Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-03-06T15:08:22.805Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-03-06T15:08:22.863Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-03-06T15:08:22.914Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>) into write/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-03-06T15:08:22.959Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-03-06T15:08:23.016Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/Delete into write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-03-06T15:08:23.075Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-03-06T15:08:23.143Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-03-06T15:08:23.207Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-03-06T15:08:23.264Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-03-06T15:08:23.540Z: JOB_MESSAGE_DEBUG: Executing wait step start53
root: INFO: 2019-03-06T15:08:23.692Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseJobName/Read+write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>)
root: INFO: 2019-03-06T15:08:24.113Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/CreateFilePrefixView/Read+write/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-03-06T15:08:24.125Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-03-06T15:08:24.151Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-03-06T15:08:24.201Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-03-06T15:08:24.217Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-03-06T15:08:24.269Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-03-06T15:08:24.353Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-03-06T15:08:24.403Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-03-06T15:08:24.457Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T15:08:35.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-06T15:09:23.455Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-06T15:10:17.964Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-03-06T15:10:18.007Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640/messages?alt=json&startTime=2019-03-06T15%3A10%3A18.007Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640/messages?alt=json&startTime=2019-03-06T15%3A10%3A18.007Z>: response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 15:12:23 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
root: INFO: 2019-03-06T15:12:21.556Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
root: INFO: 2019-03-06T15:12:21.673Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-03-06T15:12:21.734Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-03-06T15:12:21.852Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-03-06T15:12:21.896Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-03-06T15:12:22.001Z: JOB_MESSAGE_BASIC: Executing operation read+write/BigQueryBatchFileLoads/ApplyGlobalWindow+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
root: INFO: 2019-03-06T15:12:22.192Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_6178600012881544721". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_6178600012881544721".
root: INFO: 2019-03-06T15:12:25.065Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>).out" materialized.
root: INFO: 2019-03-06T15:12:25.269Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0)
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640/messages?alt=json&startTime=2019-03-06T15%3A12%3A25.269Z after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640/messages?alt=json&startTime=2019-03-06T15%3A12%3A25.269Z>: response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 15:13:00 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
root: INFO: 2019-03-06T15:12:25.344Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0)
root: INFO: 2019-03-06T15:12:25.476Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0).output" materialized.
root: INFO: 2019-03-06T15:12:25.527Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:498>).out.0).output" materialized.
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640?alt=json>: response: <{'status': '503', 'content-length': '102', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 06 Mar 2019 15:14:37 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 29 tests in 2895.762s

FAILED (SKIP=1, errors=2, failures=7)
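
Each integration test above blocks on the Dataflow job it launches, polling job state and messages (the JOB_STATE_RUNNING line and the jobs/.../messages requests in the captured logging) until the job reaches a terminal state. In user code the same wait is a one-liner; a minimal sketch, assuming pipeline options for the DataflowRunner are configured elsewhere:

    import apache_beam as beam

    p = beam.Pipeline()  # DataflowRunner options assumed to be set elsewhere
    # ... construct the pipeline ...
    result = p.run()
    # Blocks, polling job state and messages as in the log above,
    # until the job reaches a terminal state.
    result.wait_until_finish()
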
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_02-17093855029966017604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_52_38-10443028367810726811?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_59_18-15607507111918114217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_04-1153578279217618114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_03-18338323234331166489?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_00_26-9443480536158285638?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_00-9062598405450818296?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_53_18-13567039053449448537?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_01_32-15875697187510519663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_07_44-2401173242722983381?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_00-568063932886908209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_55_20-16859286956165944982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_03_01-3563738840786539310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_01-15249700790262488479?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_55_29-4088430299352693492?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_05_00-13769814771957610686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_08_15-3338326682523013640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_03-5815405287088205137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_56_15-13612889832177992778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_01_08-9220452966887916482?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_48_01-12578433377549356534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_06_57_00-8479063090684257914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_00_58-15951534218752600842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_07_52-815377679972177385?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_15_18-4451169672849750715?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_22_47-4566154293585363293?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_07_29_28-11756234182830723114?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 278

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 38s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/s5rck2ipyq3ze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7576

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7576/display/redirect>

------------------------------------------
[...truncated 667.44 KB...]
      "kind": "ParallelDo", 
      "name": "s32", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s31"
        }, 
        "serialized_fn": "<string of 976 bytes>", 
        "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s33", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DeleteTablesFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s32"
        }, 
        "serialized_fn": "<string of 324 bytes>", 
        "user_name": "write/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
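
The truncated graph above is the tail of Beam's BigQuery batch file-loads sink: once the load and copy jobs finish, RemoveTempTables/GetTableNames collects the temporary table names and RemoveTempTables/Delete (apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn) drops them. The test does not build these steps by hand; they are expanded from a WriteToBigQuery transform labeled "write". A minimal sketch of a pipeline that expands into a graph of this shape, assuming current Beam Python APIs (the table and schema are illustrative, not taken from the log):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.Create([{'name': 'a'}, {'name': 'b'}])  # placeholder input
         | 'write' >> beam.io.WriteToBigQuery(
             'apache-beam-testing:example_dataset.example_table',  # hypothetical table
             schema='name:STRING',
             method=beam.io.WriteToBigQuery.Method.FILE_LOADS))

The 'write' label on the transform is what prefixes every step name in the JSON above (write/BigQueryBatchFileLoads/...).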
root: INFO: Create job: <Job
 createTime: u'2019-03-06T12:26:28.965724Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-03-06_04_26_28-9006899475303244184'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306122615-000018'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-03-06T12:26:28.965724Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-03-06_04_26_28-9006899475303244184]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_26_28-9006899475303244184?project=apache-beam-testing
root: INFO: Job 2019-03-06_04_26_28-9006899475303244184 is in state JOB_STATE_RUNNING
root: INFO: 2019-03-06T12:26:28.183Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-03-06_04_26_28-9006899475303244184. The number of workers will be between 1 and 1000.
root: INFO: 2019-03-06T12:26:28.214Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-03-06_04_26_28-9006899475303244184.
root: INFO: 2019-03-06T12:26:31.110Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-03-06T12:26:31.919Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-03-06T12:26:32.470Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T12:26:32.514Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T12:26:32.570Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-03-06T12:26:32.609Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-03-06T12:26:32.666Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-03-06T12:26:32.858Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-03-06T12:26:33.194Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-03-06T12:26:33.247Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s14.out
root: INFO: 2019-03-06T12:26:33.294Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten write/BigQueryBatchFileLoads/DestinationFilesUnion, into producer write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-03-06T12:26:33.341Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-03-06T12:26:33.388Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-03-06T12:26:33.440Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15-u40 for input s16-reify-value18-c38
root: INFO: 2019-03-06T12:26:33.488Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten write/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T12:26:33.543Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T12:26:33.594Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-03-06T12:26:33.640Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into write/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-03-06T12:26:33.677Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/AppendDestination into write/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-03-06T12:26:33.730Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into write/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-03-06T12:26:33.781Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/DropShardNumber into write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-03-06T12:26:33.832Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/Write into write/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-03-06T12:26:33.881Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/Reify into write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-03-06T12:26:33.934Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into write/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-03-06T12:26:33.982Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ApplyGlobalWindow into read
root: INFO: 2019-03-06T12:26:34.030Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-03-06T12:26:34.065Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/GenerateFilePrefix into write/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-03-06T12:26:34.112Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Read
root: INFO: 2019-03-06T12:26:34.161Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid
root: INFO: 2019-03-06T12:26:34.208Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-03-06T12:26:34.263Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Partial
root: INFO: 2019-03-06T12:26:34.314Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/RemoveTempTables/DeduplicateTables:PairWithVoid into write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-03-06T12:26:34.401Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-03-06T12:26:34.522Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine
root: INFO: 2019-03-06T12:26:34.580Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/Combine/Extract
root: INFO: 2019-03-06T12:26:34.631Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-03-06T12:26:34.673Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-03-06T12:26:34.725Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>) into write/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-03-06T12:26:34.776Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Write into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Reify
root: INFO: 2019-03-06T12:26:34.819Z: JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/RemoveTempTables/Delete into write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-03-06T12:26:34.872Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-03-06T12:26:34.921Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-03-06T12:26:34.959Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-03-06T12:26:35.010Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-03-06T12:26:35.226Z: JOB_MESSAGE_DEBUG: Executing wait step start53
root: INFO: 2019-03-06T12:26:35.324Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseJobName/Read+write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:498>)
root: INFO: 2019-03-06T12:26:35.379Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/CreateFilePrefixView/Read+write/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-03-06T12:26:35.390Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-03-06T12:26:35.438Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-03-06T12:26:35.438Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-03-06T12:26:35.494Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-03-06T12:26:35.548Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Create
root: INFO: 2019-03-06T12:26:35.594Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-03-06T12:26:35.638Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-03-06T12:26:35.682Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
root: INFO: 2019-03-06T12:26:45.925Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
--------------------- >> end captured logging << ---------------------
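
The fusion and combiner-lifting messages captured above show the Dataflow optimizer at work: adjacent ParDo/Read/Write steps are fused into single stages, and a GroupByKey followed by a CombineFn is lifted into a pre-shuffle partial combine plus a post-shuffle merge (the CombinePerKey(CountCombineFn)/Combine/Partial and .../Extract stages in the log), while a bare GroupByKey such as GroupFilesByTableDestinations is shuffled as-is ("Combiner lifting skipped"). A minimal sketch of the two shapes, using only standard Beam primitives:

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('t1', 1), ('t1', 1), ('t2', 1)])

        # Liftable: the runner can combine per key before the shuffle,
        # which Dataflow splits into Partial and Extract stages.
        counts = kvs | 'CountPerKey' >> beam.CombinePerKey(sum)

        # Not liftable: a bare GroupByKey ships every element through
        # the shuffle, so the optimizer logs "Combiner lifting skipped".
        grouped = kvs | 'JustGroup' >> beam.GroupByKey()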

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 29 tests in 2133.978s

FAILED (SKIP=1, failures=3)
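
Each "Found:" line below is the monitoring-console URL for one Dataflow job launched by the suite; the project (apache-beam-testing) and region (us-central1) match the Job fields logged above. A minimal sketch of the pipeline options such a test would pass to target that runner (the bucket is a placeholder, not a value from the log):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',
        region='us-central1',
        temp_location='gs://example-bucket/temp',  # hypothetical bucket
    )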
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_03-2279695309111449180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_17_56-10509515138824668404?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_26_28-9006899475303244184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_30_36-9549031971055427843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_03-114561109057869907?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_19_02-12985233070087030586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_27_12-58955218381870778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_04-17724030628659917815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_02-13892258904726843051?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_24_19-15213907967704388411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_02-9001979442290162872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_12_29-14344505863138060502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_19_51-12904899626870501141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_03-8495525376614338100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_12_25-15891836779545449313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_20_12-931451739067455643?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_25_58-10592031588851058052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_02-17207617215745197179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_12_41-10973827533143858896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_19_57-568092891647544755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_05_02-1158030668901401603?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_13_27-10771932763014301014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_23_14-8327754428802544094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-06_04_30_21-16538924513445927642?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 278

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 40m 15s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/4nbppeug5pt6i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org