Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/09/05 01:04:17 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #244

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/244/display/redirect>

Changes:


------------------------------------------
[...truncated 22.43 MB...]
            "value": "RETRY_ALWAYS"
          },
          {
            "key": "create_disposition",
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn",
            "type": "STRING",
            "value": "CREATE_IF_NEEDED"
          },
          {
            "key": "write_disposition",
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn",
            "type": "STRING",
            "value": "WRITE_APPEND"
          },
          {
            "key": "additional_bq_parameters",
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn",
            "type": "STRING",
            "value": "{}"
          },
          {
            "key": "ignore_insert_ids",
            "namespace": "apache_beam.io.gcp.bigquery.BigQueryWriteFn",
            "type": "STRING",
            "value": "False"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out"
          },
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "FailedRows",
            "user_name": "WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).FailedRows"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s29"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_48",
        "user_name": "WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2020-09-05T00:04:12.109920Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-09-04_17_04_10-14416228897744553399'
 location: 'us-central1'
 name: 'beamapp-jenkins-0905000401-132872'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-09-05T00:04:12.109920Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-09-04_17_04_10-14416228897744553399]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2020-09-04_17_04_10-14416228897744553399
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-04_17_04_10-14416228897744553399?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-09-04_17_04_10-14416228897744553399 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:14.869Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.410Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.414Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.422Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.445Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.447Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.465Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.485Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.489Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateUserScores/LeaderboardUserGlobalWindows into AddEventTimestamps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.492Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateTeamScores/LeaderboardTeamFixedWindows into AddEventTimestamps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.494Z: JOB_MESSAGE_DETAILED: Fusing consumer DecodeString into ReadPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.496Z: JOB_MESSAGE_DETAILED: Fusing consumer ParseGameEventFn into DecodeString
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.499Z: JOB_MESSAGE_DETAILED: Fusing consumer AddEventTimestamps into ParseGameEventFn
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.501Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateTeamScores/ExtractAndSumScore/Map(<lambda at leader_board.py:155>) into CalculateTeamScores/LeaderboardTeamFixedWindows
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.504Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/WriteStream into CalculateTeamScores/ExtractAndSumScore/Map(<lambda at leader_board.py:155>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.506Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/MergeBuckets into CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.509Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/Combine into CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.512Z: JOB_MESSAGE_DETAILED: Fusing consumer TeamScoresDict into CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.514Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/ConvertToRow into TeamScoresDict
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.516Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/AppendDestination into WriteTeamScoreSums/ConvertToRow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.518Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/AddInsertIdsWithRandomKeys into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/AppendDestination
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.520Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/Map(reify_timestamps) into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/AddInsertIdsWithRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.523Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/WriteStream into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.525Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/MergeBuckets into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.528Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps) into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.530Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/DropShard into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.532Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/DropShard
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.535Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateUserScores/ExtractAndSumScore/Map(<lambda at leader_board.py:155>) into CalculateUserScores/LeaderboardUserGlobalWindows
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.537Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/WriteStream into CalculateUserScores/ExtractAndSumScore/Map(<lambda at leader_board.py:155>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.539Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/MergeBuckets into CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.541Z: JOB_MESSAGE_DETAILED: Fusing consumer CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/Combine into CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.543Z: JOB_MESSAGE_DETAILED: Fusing consumer FormatUserScoreSums into CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.546Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/ConvertToRow into FormatUserScoreSums
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.548Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/AppendDestination into WriteUserScoreSums/ConvertToRow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.551Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/AddInsertIdsWithRandomKeys into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/AppendDestination
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.553Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/Map(reify_timestamps) into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/AddInsertIdsWithRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.556Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/WriteStream into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.561Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/MergeBuckets into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.563Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps) into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.565Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/DropShard into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.568Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/DropShard
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.587Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.605Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.647Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.856Z: JOB_MESSAGE_DEBUG: Executing wait step start2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.868Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:15.873Z: JOB_MESSAGE_BASIC: Starting 1 workers...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:19.343Z: JOB_MESSAGE_BASIC: Executing operation WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/ReadStream+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/MergeBuckets+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/DropShard+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:19.345Z: JOB_MESSAGE_BASIC: Executing operation CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/ReadStream+CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/MergeBuckets+CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/Combine+FormatUserScoreSums+WriteUserScoreSums/ConvertToRow+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/AppendDestination+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/AddInsertIdsWithRandomKeys+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/Map(reify_timestamps)+WriteUserScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:19.345Z: JOB_MESSAGE_BASIC: Executing operation ReadPubSub/Read+DecodeString+ParseGameEventFn+AddEventTimestamps+CalculateUserScores/LeaderboardUserGlobalWindows+CalculateTeamScores/LeaderboardTeamFixedWindows+CalculateTeamScores/ExtractAndSumScore/Map(<lambda at leader_board.py:155>)+CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/WriteStream+CalculateUserScores/ExtractAndSumScore/Map(<lambda at leader_board.py:155>)+CalculateUserScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:19.346Z: JOB_MESSAGE_BASIC: Executing operation CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/ReadStream+CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/MergeBuckets+CalculateTeamScores/ExtractAndSumScore/CombinePerKey(sum)/Combine+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/AddInsertIdsWithRandomKeys+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/Map(reify_timestamps)+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:19.348Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/ReadStream+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/GroupByKey/MergeBuckets+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/DropShard+WriteTeamScoreSums/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:42.374Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:49.783Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:04:50.182Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-05T00:05:16.988Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-09-04_17_04_10-14416228897744553399 after 603 seconds
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT total_score FROM `apache-beam-testing.leader_board_it_dataset15992642203253.leader_board_users` WHERE total_score=5000 LIMIT 1 to BQ
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/4d21c52d-9e55-4673-a011-e038e00af40e?maxResults=0&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon7efa9521_5196_4a8d_b9a4_393e3a9b14c0/data HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Read from given query (SELECT total_score FROM `apache-beam-testing.leader_board_it_dataset15992642203253.leader_board_users` WHERE total_score=5000 LIMIT 1), total rows 0
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Generate checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/leader_board_it_dataset15992642203253?deleteContents=true HTTP/1.1" 204 0
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3646.239s

FAILED (SKIP=7, failures=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 44s
172 actionable tasks: 132 executed, 36 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/uvaouwwdoiz32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #248

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/248/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #247

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/247/display/redirect>

Changes:


------------------------------------------
[...truncated 15.78 MB...]
                {
                  "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWdfr1D4AAeH/8P//eCtAY5lNjencgr//3+ZjxC4AYEAAQEACmTUG2Mg0TRGpqNlMnpDTanqGhiY1A9TTIDQaaDQ2poAASpoCnoAFPTVPIymTJptR6jIDag0NABoyaAMgxTSNRqbSDQaGmgA9QBo0yAAAADRpkDjQ0DRpkaaNMgMTBAADQGgNMgMCZD0L9ag7CuKta8zHIRSEDwxuHMZSINFcN0tB/WzMvdzmRdRAcHBHpRAegJAikTWRjRFwi9WkiPLuQmPsgUZGYNmJaM+lIbndwLfa/cPW7aH7HIzUtTJjkgkqVadzAiBcdmyjWNPjleEZrwVsQQgKNa3n4YZbCm+E8V25Rhwimuy2VcWky2RrQl2cl/C40PXOPGD50v1WAdZ4fLDTf9qAJIGhK5YeW/f506XHs0CcFBA9DjC8kNVhE9bTfc14stYnedJCU0lp21jwfqd7d/g3a1qcTZ5rRKUoWm2ldIRdUyq7NtZMOFSsrSYXWoMCDvNol+LnKxcY7Q3nI1qQrW8cNFmsSPc9fVobYXmQI1vtUwy0Av3QwCFHAeJFFSlTxM2MpukLnrBY65Br+YS7Ii+5zkNhXLjYrIZ5rSvyKtMZbdX1E5Ekv2L6WYGWLEXDl7ISNeiPZ0LUG4JU8iclz2D3JWqR0RiTRETgpUEHvDGMp6ASoGL2WjRF5XlFPIZCS04oMFDVnlpZ2RvE+QiAgqUiL2JRwB5QFMaDrth/ROjp7TsTlbsmvU4wGLngGpqiuyEBpNOzEglNF7NZgxu7vTTG5Kwermh2HTlKCUFs7YtW3NzaRllFUIyYzGGIctWuN4scrpEHDVQOJdRMDzPVnHorRyeiajXP8t+aXrzli7yvRsldTnYT8J4MOCDZKGKN/PgQlJpqYTUwMahjkMQa6qoiJENNBkhKvhAtUc9MhGgfFAi1O/bY/nEUpquJbn5d3pxdIsYybogSogkidI1GHHPpg8jtpvTrjNDQm/btVLqmXihF6+9VpjhB8hTLcw76OkgiGrOqRX18f/F3JFOFCQ1+vUPg",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-09-05T19:02:08.775115Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-09-05_12_02_07-16679474707356843510'
 location: 'us-central1'
 name: 'beamapp-jenkins-0905190201-454424'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-09-05T19:02:08.775115Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-09-05_12_02_07-16679474707356843510]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-09-05_12_02_07-16679474707356843510
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_12_02_07-16679474707356843510?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_12_02_07-16679474707356843510 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:07.431Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-09-05_12_02_07-16679474707356843510. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:07.431Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-09-05_12_02_07-16679474707356843510.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:11.546Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.317Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.354Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.389Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.416Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.483Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.523Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.551Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.584Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.616Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.648Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.683Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.717Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.751Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.782Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.815Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:13.842Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.022Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.144Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.188Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.219Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.265Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.330Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:14.395Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:23.387Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:26.725Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:26.790Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:26.848Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:26.914Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:31.864Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:31.905Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:31.944Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:36.077Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:36.140Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:36.211Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:36.253Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:36.284Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_11_56_31-6163463127534091349 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_15993321813602.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/8314e2b4-4abe-4f26-a521-1ddc48403a2a?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon25e937a4f8a0c1fad3eceb09289acf4be131024c/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_15993321813602.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:43.840Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:02:46.266Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:03:25.700Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:03:25.743Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:03:25.779Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_11_57_32-17691164415255625665 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:03:55.839Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:03:55.874Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:02.608Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:05.764Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:06.039Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:06.093Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:06.152Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:15.090Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:15.152Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:15.228Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:15.310Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:04:15.348Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:05:07.087Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:05:07.133Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:05:07.171Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_11_59_18-6890186143421663124 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:19.743Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:19.815Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:19.865Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:19.925Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:29.036Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:29.108Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:29.180Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:29.237Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:07:29.267Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:08:24.863Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:08:24.924Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T19:08:24.961Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_12_02_07-16679474707356843510 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3803.126s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':release:go-licenses:java:dockerRun'.
> Process 'command 'docker'' finished with non-zero exit value 125

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 8s
168 actionable tasks: 128 executed, 36 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/qyskrq6ncdcr2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #246

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/246/display/redirect>

Changes:


------------------------------------------
[...truncated 21.83 MB...]
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWdfr1D4AAeH/8P//eCtAY5lNjencgr//3+ZjxC4AYEAAQEACmTUG2Mg0TRGpqNlMnpDTanqGhiY1A9TTIDQaaDQ2poAASpoCnoAFPTVPIymTJptR6jIDag0NABoyaAMgxTSNRqbSDQaGmgA9QBo0yAAAADRpkDjQ0DRpkaaNMgMTBAADQGgNMgMCZD0L9ag7CuKta8zHIRSEDwxuHMZSINFcN0tB/WzMvdzmRdRAcHBHpRAegJAikTWRjRFwi9WkiPLuQmPsgUZGYNmJaM+lIbndwLfa/cPW7aH7HIzUtTJjkgkqVadzAiBcdmyjWNPjleEZrwVsQQgKNa3n4YZbCm+E8V25Rhwimuy2VcWky2RrQl2cl/C40PXOPGD50v1WAdZ4fLDTf9qAJIGhK5YeW/f506XHs0CcFBA9DjC8kNVhE9bTfc14stYnedJCU0lp21jwfqd7d/g3a1qcTZ5rRKUoWm2ldIRdUyq7NtZMOFSsrSYXWoMCDvNol+LnKxcY7Q3nI1qQrW8cNFmsSPc9fVobYXmQI1vtUwy0Av3QwCFHAeJFFSlTxM2MpukLnrBY65Br+YS7Ii+5zkNhXLjYrIZ5rSvyKtMZbdX1E5Ekv2L6WYGWLEXDl7ISNeiPZ0LUG4JU8iclz2D3JWqR0RiTRETgpUEHvDGMp6ASoGL2WjRF5XlFPIZCS04oMFDVnlpZ2RvE+QiAgqUiL2JRwB5QFMaDrth/ROjp7TsTlbsmvU4wGLngGpqiuyEBpNOzEglNF7NZgxu7vTTG5Kwermh2HTlKCUFs7YtW3NzaRllFUIyYzGGIctWuN4scrpEHDVQOJdRMDzPVnHorRyeiajXP8t+aXrzli7yvRsldTnYT8J4MOCDZKGKN/PgQlJpqYTUwMahjkMQa6qoiJENNBkhKvhAtUc9MhGgfFAi1O/bY/nEUpquJbn5d3pxdIsYybogSogkidI1GHHPpg8jtpvTrjNDQm/btVLqmXihF6+9VpjhB8hTLcw76OkgiGrOqRX18f/F3JFOFCQ1+vUPg",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-09-05T12:58:04.128580Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-09-05_05_58_02-11621831926804639766'
 location: 'us-central1'
 name: 'beamapp-jenkins-0905125752-522782'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-09-05T12:58:04.128580Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-09-05_05_58_02-11621831926804639766]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-09-05_05_58_02-11621831926804639766
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_05_58_02-11621831926804639766?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_05_58_02-11621831926804639766 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:02.673Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-09-05_05_58_02-11621831926804639766.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:02.673Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-09-05_05_58_02-11621831926804639766. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:06.465Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.432Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.461Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.500Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.532Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.594Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.641Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.675Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.712Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.746Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.772Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.805Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.840Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.876Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.911Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.945Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:07.974Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.102Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.242Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.283Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.317Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.354Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.411Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:08.479Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:38.558Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
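The warning above amounts to a small how-to: each unique user-defined metric name creates one metric descriptor, and stale descriptors have to be deleted through the Monitoring API before new Dataflow custom metrics can appear. Purely as an illustration (not part of the build output), the sketch below does that clean-up with the google-cloud-monitoring Python client; the client library, the 2.x request-dict call style, and the custom.googleapis.com filter prefix are assumptions, and the API-explorer links in the warning itself remain the canonical instructions.

    # Hedged sketch: delete old user-defined metric descriptors so new Dataflow
    # custom metrics can be created again. Assumes google-cloud-monitoring >= 2.0
    # and that the caller may delete descriptors in this project.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # project id taken from the log above

    # The custom.googleapis.com prefix is an assumption about where these
    # user-defined metrics live; review the filter before deleting anything.
    descriptors = client.list_metric_descriptors(
        request={
            "name": project_name,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        })

    for descriptor in descriptors:
        print("deleting", descriptor.type)
        client.delete_metric_descriptor(request={"name": descriptor.name})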
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:42.180Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:42.223Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:42.261Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_05_52_47-18108715164134992897 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:58:47.554Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_15993103579881.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/89f7cb72-b66c-47fd-b29a-4f6da84cc6f5?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anond1e49766bd9af9c801563a33fc7fef048dcd0915/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_15993103579881.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:59:58.013Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T12:59:59.585Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:02.710Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:02.777Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:02.815Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:02.882Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:01.123Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:01.199Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:01.251Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:01.328Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:01.253Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:01.295Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:12.076Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:12.140Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:12.203Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:12.249Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:12.286Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:10.466Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:10.535Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:10.609Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:10.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:00:10.690Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:01:04.771Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:01:04.812Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:01:04.850Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:01:02.607Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:01:02.642Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:01:02.753Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_05_54_45-8940655063093754623 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_05_54_59-10138623180066741880 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:04.891Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:04.955Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:05.007Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:05.071Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:14.150Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:14.207Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:14.269Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:14.302Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:14.335Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:56.290Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:56.334Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T13:03:56.369Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_05_58_02-11621831926804639766 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3671.131s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 48s
172 actionable tasks: 132 executed, 36 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4vwqgktpldqao

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #245

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/245/display/redirect>

Changes:


------------------------------------------
[...truncated 15.77 MB...]
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWSYAoHgAAC334EwIiCAAIQkAQAC/bt0CYAQOACAAahpI0Bppo0GmRtINBqJ6QZAaNDQDyjRqC0RLwD4QPbijFck85Haypu6ovZAReEmQrcsyHc7KzNN4sFkXh7FICbooOXgwoVC0QQeEDWJD5shM4hP8XckU4UJAmAKB4A==",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWdfr1D4AAeH/8P//eCtAY5lNjencgr//3+ZjxC4AYEAAQEACmTUG2Mg0TRGpqNlMnpDTanqGhiY1A9TTIDQaaDQ2poAASpoCnoAFPTVPIymTJptR6jIDag0NABoyaAMgxTSNRqbSDQaGmgA9QBo0yAAAADRpkDjQ0DRpkaaNMgMTBAADQGgNMgMCZD0L9ag7CuKta8zHIRSEDwxuHMZSINFcN0tB/WzMvdzmRdRAcHBHpRAegJAikTWRjRFwi9WkiPLuQmPsgUZGYNmJaM+lIbndwLfa/cPW7aH7HIzUtTJjkgkqVadzAiBcdmyjWNPjleEZrwVsQQgKNa3n4YZbCm+E8V25Rhwimuy2VcWky2RrQl2cl/C40PXOPGD50v1WAdZ4fLDTf9qAJIGhK5YeW/f506XHs0CcFBA9DjC8kNVhE9bTfc14stYnedJCU0lp21jwfqd7d/g3a1qcTZ5rRKUoWm2ldIRdUyq7NtZMOFSsrSYXWoMCDvNol+LnKxcY7Q3nI1qQrW8cNFmsSPc9fVobYXmQI1vtUwy0Av3QwCFHAeJFFSlTxM2MpukLnrBY65Br+YS7Ii+5zkNhXLjYrIZ5rSvyKtMZbdX1E5Ekv2L6WYGWLEXDl7ISNeiPZ0LUG4JU8iclz2D3JWqR0RiTRETgpUEHvDGMp6ASoGL2WjRF5XlFPIZCS04oMFDVnlpZ2RvE+QiAgqUiL2JRwB5QFMaDrth/ROjp7TsTlbsmvU4wGLngGpqiuyEBpNOzEglNF7NZgxu7vTTG5Kwermh2HTlKCUFs7YtW3NzaRllFUIyYzGGIctWuN4scrpEHDVQOJdRMDzPVnHorRyeiajXP8t+aXrzli7yvRsldTnYT8J4MOCDZKGKN/PgQlJpqYTUwMahjkMQa6qoiJENNBkhKvhAtUc9MhGgfFAi1O/bY/nEUpquJbn5d3pxdIsYybogSogkidI1GHHPpg8jtpvTrjNDQm/btVLqmXihF6+9VpjhB8hTLcw76OkgiGrOqRX18f/F3JFOFCQ1+vUPg",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-09-05T07:00:38.281916Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-09-05_00_00_36-8987295277847361345'
 location: 'us-central1'
 name: 'beamapp-jenkins-0905070030-298239'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-09-05T07:00:38.281916Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-09-05_00_00_36-8987295277847361345]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-09-05_00_00_36-8987295277847361345
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-05_00_00_36-8987295277847361345?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_00_00_36-8987295277847361345 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:37.021Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-09-05_00_00_36-8987295277847361345.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:37.021Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-09-05_00_00_36-8987295277847361345. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:41.395Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.260Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.318Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.377Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.443Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.563Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.664Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.712Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.828Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.882Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:43.927Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.051Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.117Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.162Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.204Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.247Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.301Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.529Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.660Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.726Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.790Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.845Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:44.954Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:00:45.066Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:12.063Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:13.419Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:28.364Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_4385537994358379175". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_4385537994358379175".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:39.004Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_4385537994358379175" done.
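The two log lines above also show the documented way to check on the BigQuery load job with the bq CLI. Purely as an illustration (not part of the build output), the same check could be done from Python with the google-cloud-bigquery client; the library, the get_job call, and the "US" location are assumptions here, and the bq command quoted in the log remains the canonical method.

    # Hedged sketch: look up the BigQuery load job mentioned in the log above.
    # Assumes google-cloud-bigquery is installed and that the job ran in the US
    # multi-region (the location is an assumption, not taken from the log).
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_4385537994358379175", location="US")
    print(job.job_type, job.state, job.error_result)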
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:39.890Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:39.995Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:40.275Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:40.346Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:01:40.394Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:00.058Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:03.207Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:03.296Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:03.349Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:03.433Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:12.633Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:12.716Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:12.807Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:12.862Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:12.899Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:28.746Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:28.869Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:28.921Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-04_23_56_52-15939406931175851059 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_1599289001549.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/8c6abd4c-a029-40ed-82a1-433743de3aa4?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon2583bab6aa0620c34beb16629939a983792e3194/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_1599289001549.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:40.827Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:02:40.892Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:18.118Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:20.210Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:20.334Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:20.386Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:24.493Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:24.580Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:24.635Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:24.719Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-04_23_56_48-7605070405320245279 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:33.873Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:33.953Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:34.045Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:34.101Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:03:34.123Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:04:37.026Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:04:37.073Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:04:37.106Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-04_23_58_20-2127698084720724257 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:05:54.020Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:05:54.153Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:05:54.219Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:05:54.312Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:01.317Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:01.439Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:01.572Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:01.649Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:01.700Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:45.906Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:45.982Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-05T07:06:46.031Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-05_00_00_36-8987295277847361345 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3714.987s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':release:go-licenses:java:dockerRun'.
> Process 'command 'docker'' finished with non-zero exit value 125

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 36s
168 actionable tasks: 128 executed, 36 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/icfnseqdxa2yg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org