Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/10 12:55:22 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #1092

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1092/display/redirect>

------------------------------------------
[...truncated 336.24 KB...]
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1kMlqwzAQhp2mS6KkS9K9T5Be/BSlCwX3EqgvQUj22BHIkkcLIQdDeyl97MpOoOSQo74Zff/wf/VnGatZtgTKgVWx0HGZ1TEXJXowa1oICVRqllvyBBIczBmXYJ8VwejxG3sNHszSURRFDqyjmRSgHPaTDlG3roEuhXIWD3dS2kHH4xwybZjTxpK3j3nAry0meBTkx0mDJxu7ULV3nc/iIEnHAWnv/tkw8T9IuOcLHDU4XuDpbp5hyhbaVDYOcUA+hcr1SqiS4FkIOm/wYpYOgnTVDQqFk33/NxvkRWrO5MYTzp0Gy2U6bHswoizBBMXVPsV2JRRaMC/dfPvE6yC5SSdBwrLMV14yJ7Silc4Bb9976bTViyoUzaqaZrriQoHBuzDq6haW5hsl3v967vAh/gNL9qPV",
        "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-06-10T12:27:52.916371Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-10_05_27_51-14313341824023848865'
 location: 'us-central1'
 name: 'beamapp-jenkins-0610122742-955544'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-10T12:27:52.916371Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-10_05_27_51-14313341824023848865]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_27_51-14313341824023848865?project=apache-beam-testing
root: INFO: Job 2019-06-10_05_27_51-14313341824023848865 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-10T12:27:51.945Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-06-10_05_27_51-14313341824023848865. The number of workers will be between 1 and 1000.
root: INFO: 2019-06-10T12:27:52.026Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-06-10_05_27_51-14313341824023848865.
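The 1-to-1000 worker range above is the Dataflow default when autoscaling is enabled without explicit bounds. A minimal sketch of pinning the pool via the Python SDK's standard worker options follows; the project, bucket, and caps are placeholders, not values from this job:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Hedged sketch: bound Dataflow autoscaling via standard WorkerOptions
    # flags; all concrete values here are placeholders.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                # placeholder
        region='us-central1',
        temp_location='gs://my-bucket/tmp',  # placeholder
        num_workers=1,                       # initial pool size
        max_num_workers=10,                  # cap below the 1000 default
    )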
root: INFO: 2019-06-10T12:27:54.887Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-10T12:27:55.594Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
root: INFO: 2019-06-10T12:27:56.234Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-10T12:27:56.318Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner.
root: INFO: 2019-06-10T12:27:56.379Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-06-10T12:27:56.486Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-06-10T12:27:56.571Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-06-10T12:27:56.613Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-10T12:27:56.777Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-10T12:27:56.821Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-10T12:27:56.853Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s11.out_WrittenFiles
root: INFO: 2019-06-10T12:27:56.898Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-10T12:27:56.947Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-06-10T12:27:56.992Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-06-10T12:27:57.041Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17-u31 for input s18-reify-value9-c29
root: INFO: 2019-06-10T12:27:57.087Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-10T12:27:57.133Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-06-10T12:27:57.165Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Create/Read
root: INFO: 2019-06-10T12:27:57.207Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow into Create/Read
root: INFO: 2019-06-10T12:27:57.254Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-10T12:27:57.296Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-06-10T12:27:57.343Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-06-10T12:27:57.391Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-06-10T12:27:57.429Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-06-10T12:27:57.473Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-06-10T12:27:57.517Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-06-10T12:27:57.563Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-06-10T12:27:57.612Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-06-10T12:27:57.659Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-10T12:27:57.701Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-06-10T12:27:57.735Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-06-10T12:27:57.781Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-06-10T12:27:57.829Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-06-10T12:27:57.878Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-06-10T12:27:57.923Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-06-10T12:27:57.973Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-06-10T12:27:58.024Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-06-10T12:27:58.072Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-06-10T12:27:58.120Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-06-10T12:27:58.165Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-06-10T12:27:58.213Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-06-10T12:27:58.257Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-10T12:27:58.298Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-10T12:27:58.345Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-10T12:27:58.395Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-10T12:27:58.590Z: JOB_MESSAGE_DEBUG: Executing wait step start44
root: INFO: 2019-06-10T12:27:58.687Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>)
root: INFO: 2019-06-10T12:27:58.734Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-06-10T12:27:58.746Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-10T12:27:58.782Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
root: INFO: 2019-06-10T12:27:58.783Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-06-10T12:27:58.818Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-06-10T12:27:58.869Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-06-10T12:27:58.905Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-06-10T12:27:58.938Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-06-10T12:27:58.974Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-06-10T12:28:43.729Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-10T12:29:49.552Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-10T12:29:49.593Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-10T12:32:11.556Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>).out" materialized.
root: INFO: 2019-06-10T12:32:11.657Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:550>).out.0)
root: INFO: 2019-06-10T12:32:11.705Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:550>).out.0)
root: INFO: 2019-06-10T12:32:11.768Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:550>).out.0).output" materialized.
root: INFO: 2019-06-10T12:32:11.814Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:550>).out.0).output" materialized.
root: INFO: 2019-06-10T12:32:15.058Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
root: INFO: 2019-06-10T12:32:15.149Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-06-10T12:32:15.194Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-06-10T12:32:15.283Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-06-10T12:32:15.320Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-06-10T12:32:15.495Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow+WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write
root: INFO: Deleting dataset python_bq_streaming_inserts_1560169662368 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
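This BeamDeprecationWarning recurs throughout the run: reading options back off the pipeline object (p.options) is deprecated. The supported pattern, sketched minimally below with a placeholder transform, is to build the pipeline from an explicit PipelineOptions object and keep using that object directly:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hedged sketch: hold on to an explicit PipelineOptions object instead
    # of reading the deprecated <pipeline>.options attribute later.
    options = PipelineOptions()

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
    # consult `options` here, not p.options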
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_50-15115308850018183966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_17_40-6676340780763725266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_24_32-4266232287163109849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_31_07-5732750803309320967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_41_33-1220737257586941734?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_46-11385519803588890707?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_24_59-16889199203017240924?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_32_43-5545540298602776574?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
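The FutureWarnings above come from fileio_test.py exercising the experimental match transforms. A minimal sketch of that pattern, using the non-test entry points MatchFiles/ReadMatches with a placeholder glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    # Hedged sketch of the experimental fileio pattern the warnings refer to.
    with beam.Pipeline() as p:
        _ = (p
             | 'Match' >> fileio.MatchFiles('/tmp/data/*.txt')  # placeholder glob
             | 'Read' >> fileio.ReadMatches()
             # each element is a ReadableFile with .metadata.path and .read_utf8()
             | 'GetPath' >> beam.Map(lambda f: f.metadata.path)
             | beam.Map(print))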
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_55-6016935009269947790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_15_34-14494586235255323022?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_24_41-1882801536709766898?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_33_37-14045209692176676309?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_47-15454218438833196233?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
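The warning names the replacement directly: WriteToBigQuery supersedes BigQuerySink. A minimal sketch of the replacement call, with a placeholder table spec and schema:

    import apache_beam as beam

    # Hedged sketch of the WriteToBigQuery call the deprecation suggests;
    # the table spec and schema are placeholders.
    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',   # placeholder
                 schema='name:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))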
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_23_19-7809822085668266590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_31_40-9634173479348395914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_40_12-12871717454745387806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_47-1065652139830142466?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_11_05-11852593911558129974?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_20_26-1384694382227658356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_28_14-1662121198052079060?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_45-9735295878384622099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_11_02-11571591634127614001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_18_23-10984842509253405586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_25_49-12281204265292543551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_33_36-3881147579181934766?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_49-9900019972716592116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_10_49-13501584907490939336?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_20_30-8771331484439876745?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_28_10-9403163780175354766?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_37_26-15279422790918692379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_45_12-9648739907335425792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_01_48-1711509880229503173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_10_46-13225795021024413549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_20_08-12010113697865311559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-10_05_27_51-14313341824023848865?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Exception in thread Thread-4:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 668, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-10_05_27_51-14313341824023848865?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Mon, 10 Jun 2019 12:32:56 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(34f1b1ce65ed955d): Information about job 2019-06-10_05_27_51-14313341824023848865 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
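What killed Thread-4: the status poller asked Dataflow for job 2019-06-10_05_27_51-14313341824023848865 and got a 404, which Beam's retry wrapper (retry.py:197 in the traceback) treats as permanent rather than as a retryable server error. A minimal sketch, not Beam's actual handling, of tolerating a briefly-missing job during polling; get_job here is a stand-in for runner.dataflow_client.get_job:

    import time
    from apitools.base.py import exceptions as apitools_exceptions

    # Hedged sketch: retry a NOT_FOUND a few times before treating the
    # job as truly gone. get_job is a stand-in for the real client call.
    def poll_with_404_grace(get_job, job_id, attempts=3, delay_secs=5):
        for attempt in range(attempts):
            try:
                return get_job(job_id)
            except apitools_exceptions.HttpNotFoundError:
                if attempt == attempts - 1:
                    raise  # still missing: the job really is gone
                time.sleep(delay_secs)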


----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3229.117s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 57s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ak7bl4kw4osig

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #1093

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1093/display/redirect?page=changes>

