Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/02/11 13:31:24 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #3526

See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3526/display/redirect?page=changes>

Changes:

[relax] update versions

[relax] update autovalue version

[noreply] [BEAM-2914] Add portable merging window support to Python. (#12995)


------------------------------------------
[...truncated 51.33 MB...]
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWXI5+EkAApP/8H/////////////cwr///+ZrwCAAAEBAAvYZlANGGqYqeamp6IPUGGptT1GQADTQ0DQDRp6T1AHqeoBoPU9QZGTRkaCVNAmjQTTKeon6jJMTQAAAAAAAAAAAAA0aBoKaETI0GmJ6gAA0AAAAAAaNAAMgNNDQAHAAAAAAAAAAAAAAAAAAAAMjtJpzFFK4LYqyzjaFELa4VuSZq6LrLJsLgAp9yw6aAEwjBSTUyE4oiRSEJhDnlkr+AQNjGF8k++GNttBFVUVVI0QGgQgktElTZs51pkwhNCuGH3orT8g4ZUAQIYOSiGZlMMiCGAIz0OuasKyOuS9Ik8HI2BSHCiEFKMcXzQwDMHKitKyCim3MkNTwEZ9AUtsyJ+fibxgf1nfK3aotYaCkQpLJIovY7PmiTvRjxoxk1NIosScQM2oIpPHGOKAla0zdncrbnVmLLXfovDxyval1L1TvSprS8481pR3nwW5LjRT0UnvXsc4VQd3YYvCGaB08Rt9pulHKIEj0ILZuFtOO5cpK+QcL1jIqDguD44OS/8TqCWiAzig/mAiehklIPFsRKofUbX+K7pAI0G3EjKMe1tQovpjs/wTRhU1TACAFYLAIUJAHiS8Jm0ZKgmH7D+8SpKVoKnUZdIVTPwGfiHwxZzs5/VAz2lAusURJHd/Vx7TnKh4h2qbUaNcIjvyLapg9cBohcZBqAXVgwgwvllBlUMoIe3t/GoCmUxXGCuYtesFih2Qg159zLL2YbPyQzDwItV5jnsK5laCrC8stwii9NikbsSVXZRmCiVySa/ZBM96I9SjQEqaQMNj2DxyYAh3HqHox6lSlC0GtGToghKCOQE1VBHZ0GTeyAsTA9yl45wy8h/gfqKFCKoMqsD9AtKRJkLi2SSa6kXrczb6WlIwv/mlKwmnWvT2Ulf80XzRuv02tnmZc5dcJEVXUtHaa1zpmuHOauCQtV8qoiQXVVivQ4UEOmYMCPTV7rAghBHnCxDYCqKlxpwnGPHn25pghqWDG7HSooLYsHh3pNoycwEbWJXM4Wh0dyOroqfGOSZIQHLVojbrIzPU4+idbjP21swTiRZUETG3Vqeh/f9GOKGzmMK8gp22cU5fdYS4WqVI/CzElox2gikQlNrPFLJiV5HlZroCAg++qtWtW/f6DXWpu/4u5IpwoSDkc/CSA",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
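
The "serialized_fn" value in the step above is the pickled DoFn payload. Its base64 prefix decodes to a bz2 stream header, so the blob is a bz2-compressed pickle; a minimal sketch that only inspects the magic bytes (fully round-tripping it would need bz2 plus the matching Beam SDK's pickler, which is assumed rather than shown):

    import base64

    # First characters of the serialized_fn blob above.
    prefix = base64.b64decode('QlpoOTFBWSZTWXI5+EkA')
    print(prefix[:3])  # b'BZh' -> bz2 magic number
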
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-02-11T13:22:39.055034Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-11_05_22_37-10740710535990166025'
 location: 'us-central1'
 name: 'beamapp-jenkins-0211132231-624730'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-11T13:22:39.055034Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-11_05_22_37-10740710535990166025]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-11_05_22_37-10740710535990166025
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_05_22_37-10740710535990166025?project=apache-beam-testing
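
The project, region, and job id logged above are enough to fetch the same Job record outside the SDK; a minimal sketch against the Dataflow REST API with google-api-python-client (default application credentials assumed):

    from googleapiclient.discovery import build

    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2021-02-11_05_22_37-10740710535990166025',
    ).execute()
    print(job['currentState'])  # e.g. JOB_STATE_RUNNING, as logged below
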
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16130492812135.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/859203ed-bb59-4c53-873a-879c570c9a01?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon5d650f9f1e1e3c7c83403986697ed8ea181512e2/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16130492812135.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
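
The bigquery_matcher flow above (credential lookup, query, checksum) reduces to a few client calls; a minimal sketch with google-cloud-bigquery, where the query and project come from the log and the checksum scheme, SHA-1 over the sorted stringified rows, is an assumption inferred from the 40-hex-digit value:

    import hashlib

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    rows = list(client.query(
        'SELECT fruit from `python_query_to_table_16130492812135.output_table`;'
    ).result())
    # Hypothetical checksum: SHA-1 over the sorted row tuples.
    digest = hashlib.sha1('\n'.join(
        sorted(str(tuple(row.values())) for row in rows)
    ).encode('utf-8')).hexdigest()
    print('total rows %d, checksum %s' % (len(rows), digest))
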
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_05_22_37-10740710535990166025 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:41.223Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-11_05_22_37-10740710535990166025. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:42.058Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-11_05_22_37-10740710535990166025.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:44.330Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
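
Worker shape and autoscaling bounds like those logged above come from pipeline options; a sketch of pinning them explicitly (the values mirror the log, the flag names are the standard Dataflow ones):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--machine_type=n1-standard-1',  # "Worker configuration" above
        '--num_workers=1',
        '--max_num_workers=1000',        # "between 1 and 1000" above
    ])
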
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.043Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.068Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.140Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.161Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.197Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.225Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.251Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.274Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.296Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.316Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.340Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.366Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.391Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.416Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.436Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.458Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
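
The fusion messages above imply a pipeline of roughly Create -> metrics -> map_to_common_key -> GroupByKey -> m_out; a sketch of that shape with step names copied from the log (the lambda bodies are placeholders, not the metrics test's real logic):

    import apache_beam as beam

    with beam.Pipeline() as p:  # DirectRunner here; the job above is on Dataflow
        _ = (p
             | 'Create' >> beam.Create(range(10))
             | 'metrics' >> beam.Map(lambda x: x)
             | 'map_to_common_key' >> beam.Map(lambda x: ('key', x))
             | 'GroupByKey' >> beam.GroupByKey()
             | 'm_out' >> beam.Map(print))
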
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.675Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.725Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.771Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:46.796Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:47.742Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:47.816Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:47.881Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:22:56.759Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
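
The metric-descriptor warning above suggests deleting old custom descriptors to free quota; a hedged sketch with google-cloud-monitoring (the starts_with filter is an assumption about which descriptors are the Dataflow-created ones, and deletion is irreversible):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    descriptors = client.list_metric_descriptors(request={
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        # Only delete descriptors known to be unused.
        client.delete_metric_descriptor(name=descriptor.name)
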
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:23:23.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:23:59.594Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:23:59.622Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:11.953Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:15.125Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:15.203Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:15.271Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:15.328Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:24.710Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:24.802Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:24.901Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:24.956Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:24:24.990Z: JOB_MESSAGE_BASIC: Stopping worker pool...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-11_05_09_54-7685648735299314989 after 902 seconds
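
The timeout above is the test harness giving up on wait_until_finish after roughly 15 minutes while the Dataflow job keeps running server-side; a minimal sketch of that pattern (the pipeline is a placeholder, the duration mirrors the log's 902 seconds):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions())
    _ = p | beam.Create([0, 1, 2, 3]) | beam.Map(print)
    result = p.run()
    # duration is in milliseconds; returns the terminal state, or None
    # if the job is still running when the timeout expires.
    state = result.wait_until_finish(duration=902 * 1000)
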
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16130489832987.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/92677c13-be4a-442d-b3ee-bf7621740627?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon8883b22c_9a9c_4423_8606_86a410af0d39/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(0,), (1,), (2,), (3,)]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:25:10.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:25:10.761Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:25:10.827Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_05_17_58-16200641528600759906 is in state JOB_STATE_DONE
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16130489832987?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:04.539Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:04.721Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:04.892Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:04.965Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:14.063Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:14.124Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:14.190Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:14.233Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:30:14.267Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:31:05.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:31:05.504Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-11T13:31:05.545Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_05_22_37-10740710535990166025 is in state JOB_STATE_DONE
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner'] (see the TestStream sketch after this result list)
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
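
TestStream, the reason for several SKIPs in the list above, only runs on the direct runners; a minimal sketch of the transform those tests exercise (documented usage, with streaming mode enabled explicitly):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.testing.test_stream import TestStream

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:  # DirectRunner only
        _ = (p
             | TestStream().add_elements(['a', 'b'])
                           .advance_watermark_to_infinity()
             | beam.Map(print))
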

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4709.436s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
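
Reproducing the failure locally comes down to re-running the failing task with the suggested flags; a sketch via subprocess (the checkout path is hypothetical):

    import subprocess

    subprocess.run(
        ['./gradlew',
         ':sdks:python:test-suites:portable:py36:postCommitPy36IT',
         '--stacktrace', '--info'],
        cwd='/path/to/beam',  # hypothetical local Beam checkout
        check=True,           # raises CalledProcessError on exit value 1
    )
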

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 30m 53s
213 actionable tasks: 176 executed, 33 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
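
The inotify warning above is a host limit rather than a build problem; a sketch of inspecting and, as root, raising it (the 524288 target is a common choice, not something the log prescribes, and writing /proc does not persist across reboots):

    LIMIT = '/proc/sys/fs/inotify/max_user_watches'

    with open(LIMIT) as f:
        print('current limit:', f.read().strip())

    # Equivalent to `sysctl fs.inotify.max_user_watches=524288`; root only.
    with open(LIMIT, 'w') as f:
        f.write('524288')
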

Publishing build scan...
https://gradle.com/s/dymvkpkzgkiko

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #3533

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3533/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python36 #3532

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3532/display/redirect?page=changes>

Changes:

[heejong] [BEAM-11520] Stage extra PyPI dependencies with generated requirements

[heejong] raise exception for non-file type artifacts


------------------------------------------
[...truncated 48.89 MB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWXI5+EkAApP/8H/////////////cwr///+ZrwCAAAEBAAvYZlANGGqYqeamp6IPUGGptT1GQADTQ0DQDRp6T1AHqeoBoPU9QZGTRkaCVNAmjQTTKeon6jJMTQAAAAAAAAAAAAA0aBoKaETI0GmJ6gAA0AAAAAAaNAAMgNNDQAHAAAAAAAAAAAAAAAAAAAAMjtJpzFFK4LYqyzjaFELa4VuSZq6LrLJsLgAp9yw6aAEwjBSTUyE4oiRSEJhDnlkr+AQNjGF8k++GNttBFVUVVI0QGgQgktElTZs51pkwhNCuGH3orT8g4ZUAQIYOSiGZlMMiCGAIz0OuasKyOuS9Ik8HI2BSHCiEFKMcXzQwDMHKitKyCim3MkNTwEZ9AUtsyJ+fibxgf1nfK3aotYaCkQpLJIovY7PmiTvRjxoxk1NIosScQM2oIpPHGOKAla0zdncrbnVmLLXfovDxyval1L1TvSprS8481pR3nwW5LjRT0UnvXsc4VQd3YYvCGaB08Rt9pulHKIEj0ILZuFtOO5cpK+QcL1jIqDguD44OS/8TqCWiAzig/mAiehklIPFsRKofUbX+K7pAI0G3EjKMe1tQovpjs/wTRhU1TACAFYLAIUJAHiS8Jm0ZKgmH7D+8SpKVoKnUZdIVTPwGfiHwxZzs5/VAz2lAusURJHd/Vx7TnKh4h2qbUaNcIjvyLapg9cBohcZBqAXVgwgwvllBlUMoIe3t/GoCmUxXGCuYtesFih2Qg159zLL2YbPyQzDwItV5jnsK5laCrC8stwii9NikbsSVXZRmCiVySa/ZBM96I9SjQEqaQMNj2DxyYAh3HqHox6lSlC0GtGToghKCOQE1VBHZ0GTeyAsTA9yl45wy8h/gfqKFCKoMqsD9AtKRJkLi2SSa6kXrczb6WlIwv/mlKwmnWvT2Ulf80XzRuv02tnmZc5dcJEVXUtHaa1zpmuHOauCQtV8qoiQXVVivQ4UEOmYMCPTV7rAghBHnCxDYCqKlxpwnGPHn25pghqWDG7HSooLYsHh3pNoycwEbWJXM4Wh0dyOroqfGOSZIQHLVojbrIzPU4+idbjP21swTiRZUETG3Vqeh/f9GOKGzmMK8gp22cU5fdYS4WqVI/CzElox2gikQlNrPFLJiV5HlZroCAg++qtWtW/f6DXWpu/4u5IpwoSDkc/CSA",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
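
The coder entries in the JSON above describe a pair-like coder of two FastPrimitivesCoder components in a global window; reading that back as a TupleCoder of two FastPrimitivesCoders is an assumption about how "is_pair_like" maps onto the SDK, sketched below:

    from apache_beam.coders import FastPrimitivesCoder, TupleCoder

    kv_coder = TupleCoder([FastPrimitivesCoder(), FastPrimitivesCoder()])
    print(kv_coder.encode(('key', 1)))
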
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-02-13T01:27:52.715516Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-12_17_27_51-1887875199622226211'
 location: 'us-central1'
 name: 'beamapp-jenkins-0213012744-589491'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-13T01:27:52.715516Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-12_17_27_51-1887875199622226211]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-12_17_27_51-1887875199622226211
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_17_27_51-1887875199622226211?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_17_27_51-1887875199622226211 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:54.952Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-12_17_27_51-1887875199622226211. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:55.491Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-12_17_27_51-1887875199622226211.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:57.304Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.044Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.095Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.124Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.155Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.221Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.254Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.283Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.318Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.363Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.398Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.424Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.454Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.490Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.534Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.561Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.583Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.718Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.849Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.893Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:58.926Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:59.368Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:59.447Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:27:59.508Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:11.344Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:22.507Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:29.058Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:29.304Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:29.366Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:29.458Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-12_17_13_24-2462211534719356067 after 902 seconds
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:39.371Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:38.704Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:38.769Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:38.868Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:38.915Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:28:38.944Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16131787931697.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/23ed476d-b1af-40d2-9452-c4a94a836a0f?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonf2a07389_154a_481e_96ff_b40757a784bd/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(0,), (1,), (3,), (2,)]
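
The DEBUG lines above are google.auth.default() walking its credential chain (explicit credentials, then the Cloud SDK, then App Engine, then the GCE metadata server); a sketch of triggering the same resolution directly, with the scopes copied from the token request in the log:

    import google.auth

    credentials, project = google.auth.default(scopes=[
        'https://www.googleapis.com/auth/bigquery',
        'https://www.googleapis.com/auth/cloud-platform',
    ])
    print(project, type(credentials).__name__)
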
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:29:11.602Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:29:11.635Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:29:30.748Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:29:30.796Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:29:30.845Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_17_22_09-8832219831693019294 is in state JOB_STATE_DONE
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16131787931697?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:47.268Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:47.544Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:47.589Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:47.648Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:56.731Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:56.809Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:56.899Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:56.963Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:57.001Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:35:48.716Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:35:48.768Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:35:48.807Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_17_27_51-1887875199622226211 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 5382.164s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 35m 30s
210 actionable tasks: 155 executed, 51 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/42ujc5zx2u6ea

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #3531

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3531/display/redirect?page=changes>

Changes:

[nahian97] Changing query to sql to fix doc

[evgeny.belousov] [BEAM-11807] SDK Worker multithreading causes the boto3 KeyError


------------------------------------------
[...truncated 48.92 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.379Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.403Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.425Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.495Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.531Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.558Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.576Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.601Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.621Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.646Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.672Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.696Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.719Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.795Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:47.818Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:48.072Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:48.126Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:48.164Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:48.186Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:48.872Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:48.972Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:12:49.029Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:14.218Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:14.515Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
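
That metric-descriptor warning points at the Monitoring API for cleanup. A hedged sketch with the google-cloud-monitoring client (the project id and custom.googleapis.com filter come from the message; the request-dict style follows the v2 client, and each descriptor should be confirmed unused before deleting):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to delete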
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:15.595Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:18.738Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:18.808Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:18.877Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:18.941Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:28.320Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:28.377Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:28.445Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:28.486Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:28.503Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:43.507Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/FilesToRemoveImpulse/Read+read/ReadFromBigQuery/MapFilesToRemove
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:43.772Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/MapFilesToRemove.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:43.854Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:43.894Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:43.971Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:46.695Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_15150083843017344677". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_15150083843017344677".
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:50.572Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:50.609Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:57.589Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_15150083843017344677" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:58.221Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/Write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:58.277Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:58.330Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:58.379Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:58.439Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:13:58.501Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:04.371Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:04.426Z: JOB_MESSAGE_DEBUG: Executing success step success3
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:04.498Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:04.585Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:04.649Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:24.580Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:24.654Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:24.689Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_05_58-14609536640002496312 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:46.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:46.476Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:14:46.502Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_07_16-14939014758415487357 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16131568238536.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/d87ac504-c361-472b-b437-aaabb74668ce?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon48797f7d439eb012b6257ade10e163101b94fe45/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16131568238536.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
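
For reference, that checksum is an order-insensitive hash over the returned rows. A sketch of the same idea with hashlib (the exact canonical form used by apache_beam.io.gcp.tests.bigquery_matcher may differ; this is an assumption for illustration):

    import hashlib

    def rows_checksum(rows):
        # Sort a canonical string form of each row so row order cannot
        # change the digest, then SHA-1 the concatenation.
        canonical = sorted(','.join(str(col) for col in row) for row in rows)
        return hashlib.sha1('\n'.join(canonical).encode('utf-8')).hexdigest()

    print(rows_checksum([('apple',), ('orange',)]))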
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-12_11_00_17-8181390980489537700 after 902 seconds
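
The 902-second limit corresponds to a bounded wait_until_finish on the pipeline result, which takes milliseconds; the job itself may keep running server-side after the client stops waiting. A sketch of the call pattern (the pipeline is a placeholder, and not every runner supports a bounded wait):

    import apache_beam as beam

    p = beam.Pipeline()  # placeholder; the IT builds a streaming Dataflow pipeline
    _ = p | beam.Create([0, 1, 2, 3])
    result = p.run()
    # Returns the last-observed state once the job finishes or ~902s elapse.
    print(result.wait_until_finish(duration=902 * 1000))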
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_1613156406569.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/5388b8f5-1f3b-4d0e-b504-58ff23ece624?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon5affe21b_dc33_4bbb_bc5e_522b7b9b4993/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(1,), (2,), (3,), (0,)]
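
Reproducing that verification by hand with the google-cloud-bigquery client looks roughly like this (project and query are copied from the log; the dataset is ephemeral, so the query only works while the test resources still exist):

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    query = 'SELECT number FROM python_pubsub_bq_1613156406569.output_table'
    numbers = sorted(row.number for row in client.query(query).result())
    assert numbers == [0, 1, 2, 3], numbers  # matches the result logged above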
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:37.956Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:41.109Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:41.243Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:41.303Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:41.370Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:50.724Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:50.797Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:50.913Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:50.985Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:15:51.047Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:42.300Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:42.343Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:42.369Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_09_28-5904644240812514554 is in state JOB_STATE_DONE
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_1613156406569?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:58.477Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:58.654Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:58.731Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:58.800Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:07.917Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:07.982Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:08.077Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:08.130Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:08.170Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:52.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:52.558Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:19:52.585Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_12_39-1204568464552118480 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4560.010s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 42s
210 actionable tasks: 150 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/cvae6fuegmqww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #3530

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3530/display/redirect?page=changes>

Changes:

[David Morávek] Simplify LateDataDropping runner.


------------------------------------------
[...truncated 28.98 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:50.806Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/Combine into GroupAndSum/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:50.834Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/Combine/Extract into GroupAndSum/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:50.869Z: JOB_MESSAGE_DETAILED: Fusing consumer Format into GroupAndSum/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:50.919Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/WindowInto(WindowIntoFn) into Format
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:50.988Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/WriteBundles/WriteBundles into Write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.042Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/Pair into Write/Write/WriteImpl/WriteBundles/WriteBundles
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.086Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/Reify into Write/Write/WriteImpl/Pair
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.129Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/Write into Write/Write/WriteImpl/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.182Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/GroupByWindow into Write/Write/WriteImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.224Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/Extract into Write/Write/WriteImpl/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.263Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey into Write/Write/WriteImpl/PreFinalize/MapToVoidKey1
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.337Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write into Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.391Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap into Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.427Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey into Write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.468Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write into Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.510Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap into Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.561Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey into Write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.626Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write into Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.672Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap into Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.742Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey2.out.0)/CreateIsmShardKeyAndSortKey into Write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.791Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write into Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey2.out.0)/CreateIsmShardKeyAndSortKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.834Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap into Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.880Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.928Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:51.966Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.016Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.528Z: JOB_MESSAGE_DEBUG: Executing wait step start129
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.661Z: JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/WriteBundles/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.708Z: JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.723Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.747Z: JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.771Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:52.803Z: JOB_MESSAGE_BASIC: Executing operation Read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.443Z: JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/WriteBundles/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.443Z: JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.465Z: JOB_MESSAGE_BASIC: Finished operation Read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.476Z: JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.584Z: JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/WriteBundles/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.632Z: JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.680Z: JOB_MESSAGE_DEBUG: Value "Read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.736Z: JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.808Z: JOB_MESSAGE_BASIC: Executing operation Read/Read/Impulse+Read/Read/Split+Read/Read/Reshuffle/AddRandomKeys+Read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.857Z: JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/DoOnce/Impulse+Write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2957>)+Write/Write/WriteImpl/DoOnce/Map(decode)+Write/Write/WriteImpl/InitializeWrite+Write/Write/WriteImpl/WriteBundles/MapToVoidKey0+Write/Write/WriteImpl/PreFinalize/MapToVoidKey0+Write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+Write/Write/WriteImpl/WriteBundles/MapToVoidKey0+Write/Write/WriteImpl/PreFinalize/MapToVoidKey0+Write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+Write/Write/WriteImpl/WriteBundles/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+Write/Write/WriteImpl/WriteBundles/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+Write/Write/WriteImpl/PreFinalize/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+Write/Write/WriteImpl/FinalizeWrite/_DataflowIterableAsMultimapSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T12:26:53.996Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.

> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.jobsubmission.JobServerDriver createExpansionService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Java ExpansionService started on localhost:43733'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.jobsubmission.JobServerDriver createJobServer'
INFO:apache_beam.utils.subprocess_server:b'INFO: JobService started on localhost:38251'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.jobsubmission.JobServerDriver run'
INFO:apache_beam.utils.subprocess_server:b'INFO: Job server now running, terminate with Ctrl+C'
DEBUG:root:Waiting for grpc channel to be ready at localhost:38251.
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'experiments' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'flink_master' was already added
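
Those "was already added" debug lines come from merging runner-reported options into the job's PipelineOptions. The portable suite submits against the local JobService with options along these lines (the endpoint is copied from the JobService line above; the other values are illustrative assumptions):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:38251',  # JobService address from this log
        '--environment_type=LOOPBACK',     # assumption: matches the LOOPBACK note below
    ])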
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Staging artifacts for job_37aa650d-da98-496f-a544-697604a26f57.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_37aa650d-da98-496f-a544-697604a26f57.external_10beam:env:docker:v1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Getting 7 artifacts for job_37aa650d-da98-496f-a544-697604a26f57.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:32 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_37aa650d-da98-496f-a544-697604a26f57.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:32 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Getting 0 artifacts for job_37aa650d-da98-496f-a544-697604a26f57.null.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:33 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 finishStaging'
INFO:apache_beam.utils.subprocess_server:b'INFO: Artifacts fully staged for job_37aa650d-da98-496f-a544-697604a26f57.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:34 PM org.apache.beam.runners.flink.FlinkJobInvoker invokeWithExecutor'
INFO:apache_beam.utils.subprocess_server:b'INFO: Invoking job BeamApp-jenkins-0212122633-2c564c56_7523f628-beb1-403a-8586-3377fa73d4e7 with pipeline runner org.apache.beam.runners.flink.FlinkPipelineRunner@7413c14b'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:34 PM org.apache.beam.runners.jobsubmission.JobInvocation start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting job invocation BeamApp-jenkins-0212122633-2c564c56_7523f628-beb1-403a-8586-3377fa73d4e7'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
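
Expanded into a runnable form, the pattern that warning recommends looks like this (the transform is illustrative):

    import apache_beam as beam

    # The context manager calls run() and waits for completion on exit, so the
    # LOOPBACK worker started for this environment is torn down before exit.
    with beam.Pipeline() as p:
        (p
         | beam.Create(['a', 'b', 'c'])
         | beam.Map(print))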
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:34 PM org.apache.beam.runners.flink.FlinkPipelineRunner runPipelineWithTranslator'
INFO:apache_beam.utils.subprocess_server:b'INFO: Translating pipeline to Flink program.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:34 PM org.apache.beam.runners.flink.FlinkExecutionEnvironments createBatchExecutionEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Creating a Batch Execution Environment.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:35 PM org.apache.flink.api.java.utils.PlanGenerator logTypeRegistrationDetails'
INFO:apache_beam.utils.subprocess_server:b'INFO: The job has 0 registered types and 0 default Kryo serializers'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.taskexecutor.TaskExecutorResourceUtils setConfigOptionToDefaultIfNotSet'
INFO:apache_beam.utils.subprocess_server:b'INFO: The configuration option taskmanager.cpu.cores required for local execution is not set, setting it to the maximal possible value.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.taskexecutor.TaskExecutorResourceUtils setConfigOptionToDefaultIfNotSet'
INFO:apache_beam.utils.subprocess_server:b'INFO: The configuration option taskmanager.memory.task.heap.size required for local execution is not set, setting it to the maximal possible value.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.taskexecutor.TaskExecutorResourceUtils setConfigOptionToDefaultIfNotSet'
INFO:apache_beam.utils.subprocess_server:b'INFO: The configuration option taskmanager.memory.task.off-heap.size required for local execution is not set, setting it to the maximal possible value.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.taskexecutor.TaskExecutorResourceUtils setConfigOptionToDefaultIfNotSet'
INFO:apache_beam.utils.subprocess_server:b'INFO: The configuration option taskmanager.memory.network.min required for local execution is not set, setting it to its default value 64 mb.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.taskexecutor.TaskExecutorResourceUtils setConfigOptionToDefaultIfNotSet'
INFO:apache_beam.utils.subprocess_server:b'INFO: The configuration option taskmanager.memory.network.max required for local execution is not set, setting it to its default value 64 mb.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.taskexecutor.TaskExecutorResourceUtils setConfigOptionToDefaultIfNotSet'
INFO:apache_beam.utils.subprocess_server:b'INFO: The configuration option taskmanager.memory.managed.size required for local execution is not set, setting it to its default value 128 mb.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.minicluster.MiniCluster start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting Flink Mini Cluster'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.minicluster.MiniCluster start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting Metrics Registry'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.metrics.MetricRegistryImpl <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: No metrics reporter configured, no metrics will be exposed/reported.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.minicluster.MiniCluster start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting RPC Service(s)'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:36 PM org.apache.flink.runtime.clusterframework.BootstrapTools startLocalActorSystem'
INFO:apache_beam.utils.subprocess_server:b'INFO: Trying to start local actor system'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1 applyOrElse'
INFO:apache_beam.utils.subprocess_server:b'INFO: Slf4jLogger started'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.clusterframework.BootstrapTools startActorSystem'
INFO:apache_beam.utils.subprocess_server:b'INFO: Actor system started at akka://flink'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.clusterframework.BootstrapTools startLocalActorSystem'
INFO:apache_beam.utils.subprocess_server:b'INFO: Trying to start local actor system'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM akka.event.slf4j.Slf4jLogger$$anonfun$receive$1 applyOrElse'
INFO:apache_beam.utils.subprocess_server:b'INFO: Slf4jLogger started'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.clusterframework.BootstrapTools startActorSystem'
INFO:apache_beam.utils.subprocess_server:b'INFO: Actor system started at akka://flink-metrics'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService startServer'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting RPC endpoint for org.apache.flink.runtime.metrics.dump.MetricQueryService at akka://flink-metrics/user/rpc/MetricQueryService .'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.minicluster.MiniCluster createHighAvailabilityServices'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting high-availability services'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.blob.BlobServer <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: Created BLOB server storage directory /tmp/blobStore-0468a36f-2539-48b8-909a-1b36e58b95d2'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.blob.BlobServer <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: Started BLOB server at 0.0.0.0:33615 - max concurrent requests: 50 - max backlog: 1000'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.blob.AbstractBlobCache <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: Created BLOB cache storage directory /tmp/blobStore-48a5405c-9469-4a9e-a9a6-98600b057097'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.blob.AbstractBlobCache <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: Created BLOB cache storage directory /tmp/blobStore-0b8f3d2e-a5a8-4e98-8dc4-a894b1972776'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.minicluster.MiniCluster startTaskManagers'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting 1 TaskManager(s)'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.taskexecutor.TaskManagerRunner startTaskManager'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting TaskManager with ResourceID: 9007f7ff-4a28-4938-a46c-c110d81c6779'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.taskexecutor.TaskManagerServices checkTempDirs'
INFO:apache_beam.utils.subprocess_server:b"INFO: Temporary file directory '/tmp': total 484 GB, usable 302 GB (62.40% usable)"
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl createFiles'
INFO:apache_beam.utils.subprocess_server:b'INFO: FileChannelManager uses directory /tmp/flink-io-6c118404-4329-4968-8864-a7607ab3ca33 for spill files.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl createFiles'
INFO:apache_beam.utils.subprocess_server:b'INFO: FileChannelManager uses directory /tmp/flink-netty-shuffle-140c22e0-af59-4958-83c2-c4498d81760e for spill files.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.io.network.buffer.NetworkBufferPool <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: Allocated 64 MB for network buffer pool (number of memory segments: 2048, bytes per segment: 32768).'
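
As a quick arithmetic check on the buffer-pool line above: 2048 segments of 32768 bytes each is exactly 64 MiB. A minimal sketch (not part of the Flink output):

    segments = 2048           # memory segments reported by NetworkBufferPool
    bytes_per_segment = 32768
    assert segments * bytes_per_segment == 64 * 1024 * 1024  # 64 MiB total
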
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.io.network.NettyShuffleEnvironment start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting the network environment and its components.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.taskexecutor.KvStateService start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting the kvState service and its components.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService startServer'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting RPC endpoint for org.apache.flink.runtime.taskexecutor.TaskExecutor at akka://flink/user/rpc/taskmanager_0 .'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Start job leader service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.filecache.FileCache <init>'
INFO:apache_beam.utils.subprocess_server:b'INFO: User file cache uses directory /tmp/flink-dist-cache-4ceefd2f-c493-4516-bc92-4fd7d010237f'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:37 PM org.apache.flink.runtime.rest.RestServerEndpoint start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Starting rest endpoint.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:38 PM org.apache.flink.runtime.webmonitor.WebMonitorUtils$LogFileLocation find'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Log file environment variable 'log.file' is not set."
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:38 PM org.apache.flink.runtime.webmonitor.WebMonitorUtils$LogFileLocation find'
INFO:apache_beam.utils.subprocess_server:b"WARNING: JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'web.log.path'."
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:38 PM org.apache.flink.runtime.rest.RestServerEndpoint start'
INFO:apache_beam.utils.subprocess_server:b'INFO: Rest endpoint listening at localhost:41323'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:38 PM org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService updateLeader'
INFO:apache_beam.utils.subprocess_server:b'INFO: Proposing leadership to contender http://localhost:41323'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:38 PM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint startInternal'
INFO:apache_beam.utils.subprocess_server:b'INFO: Web frontend listening at http://localhost:41323.'
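
The lines above show the Beam job server bringing up an embedded Flink MiniCluster and its REST endpoint. A minimal sketch of how a Python pipeline reaches such a cluster through the portable job service (the job endpoint below is the JobService port from this run and changes on every run):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Submit to an already-running job server; port taken from this run's log.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:38251',
        '--environment_type=LOOPBACK',  # run the SDK harness in-process
    ])
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
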

> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.jobsubmission.JobServerDriver createExpansionService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Java ExpansionService started on localhost:43733'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.jobsubmission.JobServerDriver createJobServer'
INFO:apache_beam.utils.subprocess_server:b'INFO: JobService started on localhost:38251'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.jobsubmission.JobServerDriver run'
INFO:apache_beam.utils.subprocess_server:b'INFO: Job server now running, terminate with Ctrl+C'
DEBUG:root:Waiting for grpc channel to be ready at localhost:38251.
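
The "Waiting for grpc channel" lines are the SDK polling the job service until its gRPC channel reports READY. A sketch of the same idiom with grpc-python (port from this run):

    import grpc

    channel = grpc.insecure_channel('localhost:38251')
    # Blocks until the channel is READY, or raises grpc.FutureTimeoutError.
    grpc.channel_ready_future(channel).result(timeout=60)
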
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'experiments' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'flink_master' was already added
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Staging artifacts for job_37aa650d-da98-496f-a544-697604a26f57.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_37aa650d-da98-496f-a544-697604a26f57.external_10beam:env:docker:v1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:31 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Getting 7 artifacts for job_37aa650d-da98-496f-a544-697604a26f57.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:32 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_37aa650d-da98-496f-a544-697604a26f57.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:32 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Getting 0 artifacts for job_37aa650d-da98-496f-a544-697604a26f57.null.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:33 PM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 finishStaging'
INFO:apache_beam.utils.subprocess_server:b'INFO: Artifacts fully staged for job_37aa650d-da98-496f-a544-697604a26f57.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 12:26:34 PM org.apache.beam.runners.flink.FlinkJobInvoker invokeWithExecutor'
INFO:apache_beam.utils.subprocess_server:b'INFO: Invoking job BeamApp-jenkins-0212122633-2c564c56_7523f628-beb1-403a-8586-3377fa73d4e7 with pipeline runner org.apache.beam.runners.flink.FlinkPipelineRunner@7413c14b'
java.lang.OutOfMemoryError: GC overhead limit exceeded
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python36 #3529

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3529/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10925] Don't publish udf-test-provider to Maven.

[noreply] [BEAM-11780] Use vendored cloudbuild python client. (#13933)

[noreply] [BEAM-11804] Remove vendors/sdk-java-extensions-protobuf (#13968)

[noreply] [BEAM-7372][BEAM-9372] cleanup python 2.x and 3.5 codepaths (#13913)


------------------------------------------
[...truncated 29.18 MB...]
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)'
INFO:apache_beam.utils.subprocess_server:b'\t... 1 more'
INFO:apache_beam.utils.subprocess_server:b''
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rest.handler.legacy.backpressure.BackPressureRequestCoordinator shutDown'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down back pressure request coordinator.'
ERROR:root:java.lang.RuntimeException: Error received from SDK harness for instruction 48: org.apache.beam.sdk.util.UserCodeException: java.lang.IllegalArgumentException: Multiple entries with same key: user-agent=Apache_Beam_Java/2.29.0-SNAPSHOT and user-agent=spanner-java/
	at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
	at org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema$DoFnInvoker.invokeSetup(Unknown Source)
	at org.apache.beam.fn.harness.FnApiDoFnRunner.<init>(FnApiDoFnRunner.java:473)
	at org.apache.beam.fn.harness.FnApiDoFnRunner$Factory.createRunnerForPTransform(FnApiDoFnRunner.java:183)
	at org.apache.beam.fn.harness.FnApiDoFnRunner$Factory.createRunnerForPTransform(FnApiDoFnRunner.java:157)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:247)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:208)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:208)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.createBundleProcessor(ProcessBundleHandler.java:518)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.lambda$processBundle$0(ProcessBundleHandler.java:287)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler$BundleProcessorCache.get(ProcessBundleHandler.java:598)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:282)
	at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
	at org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Multiple entries with same key: user-agent=Apache_Beam_Java/2.29.0-SNAPSHOT and user-agent=spanner-java/
	at com.google.common.collect.ImmutableMap.conflictException(ImmutableMap.java:215)
	at com.google.common.collect.ImmutableMap.checkNoConflict(ImmutableMap.java:209)
	at com.google.common.collect.RegularImmutableMap.checkNoConflictInKeyBucket(RegularImmutableMap.java:147)
	at com.google.common.collect.RegularImmutableMap.fromEntryArray(RegularImmutableMap.java:110)
	at com.google.common.collect.ImmutableMap$Builder.build(ImmutableMap.java:393)
	at com.google.cloud.spanner.spi.v1.GapicSpannerRpc.<init>(GapicSpannerRpc.java:320)
	at com.google.cloud.spanner.SpannerOptions$DefaultSpannerRpcFactory.create(SpannerOptions.java:467)
	at com.google.cloud.spanner.SpannerOptions$DefaultSpannerRpcFactory.create(SpannerOptions.java:462)
	at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:561)
	at com.google.cloud.spanner.SpannerOptions.getSpannerRpcV1(SpannerOptions.java:1169)
	at com.google.cloud.spanner.SpannerImpl.<init>(SpannerImpl.java:134)
	at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:457)
	at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:452)
	at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:541)
	at org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor.createAndConnect(SpannerAccessor.java:163)
	at org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor.getOrCreate(SpannerAccessor.java:98)
	at org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema.setup(ReadSpannerSchema.java:45)
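
The root cause above is Guava's ImmutableMap.Builder, which fails fast in build() when two entries share a key; here GapicSpannerRpc ends up merging a Beam user-agent header with the Spanner client's own. A hypothetical Python analogue of that duplicate-key check (illustration only; the real failure is in the Java stack above):

    def build_immutable(entries):
        """Fail-fast map construction, mirroring Guava's checkNoConflict."""
        result = {}
        for key, value in entries:
            if key in result:
                raise ValueError('Multiple entries with same key: '
                                 '%s=%s and %s=%s' % (key, result[key], key, value))
            result[key] = value
        return result

    build_immutable([
        ('user-agent', 'Apache_Beam_Java/2.29.0-SNAPSHOT'),
        ('user-agent', 'spanner-java/'),
    ])  # raises ValueError, mirroring the IllegalArgumentException above
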

INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO:apache_beam.utils.subprocess_server:b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-50bb49d0-6bbc-4f56-a9ae-bfc3c2b19334'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down the network environment and its components.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO:apache_beam.utils.subprocess_server:b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-07f8daae-4fe4-40fa-b313-3ce18bd032db'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down the kvState service and its components.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stop job leader service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO:apache_beam.utils.subprocess_server:b'INFO: removed file cache directory /tmp/flink-dist-cache-53a3deb9-7eac-468f-8c17-ad121ff7d886'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopping Akka RPC service.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopping Akka RPC service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped Akka RPC service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down BLOB cache'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down BLOB cache'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.blob.BlobServer close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped BLOB server at 0.0.0.0:44201'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped Akka RPC service.'
ERROR
test_spanner_update (apache_beam.io.gcp.tests.xlang_spannerio_it_test.CrossLanguageSpannerIOTest) ... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/java/io/google-cloud-platform/expansion-service/build/libs/beam-sdks-java-io-google-cloud-platform-expansion-service-2.29.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/java/io/google-cloud-platform/expansion-service/build/libs/beam-sdks-java-io-google-cloud-platform-expansion-service-2.29.0-SNAPSHOT.jar'> '56211']
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at localhost:56211'
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external transforms: [beam:external:java:pubsub:read:v1, beam:external:java:pubsub:write:v1, beam:external:java:spanner:insert:v1, beam:external:java:spanner:update:v1, beam:external:java:spanner:replace:v1, beam:external:java:spanner:insert_or_update:v1, beam:external:java:spanner:delete:v1, beam:external:java:spanner:read:v1, beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:pubsub:read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@1f89ab83'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:pubsub:write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@e73f9ac'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:insert:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@61064425'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@7b1d7fff'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:replace:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@299a06ac'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:insert_or_update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@383534aa'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:delete:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@6bc168e5'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@7b3300e5'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@2e5c649'
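
The expansion service above advertises cross-language transform URNs. From Python these are reached through thin wrappers that dial the expansion service; a sketch using the generate_sequence URN (wrapper module path as of this 2.29.0-SNAPSHOT line, and the service address is the one printed for this run):

    import apache_beam as beam
    from apache_beam.io.external.generate_sequence import GenerateSequence

    with beam.Pipeline() as p:
        _ = p | GenerateSequence(start=1, stop=10,
                                 expansion_service='localhost:56211')
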
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:04 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 'Write to Spanner' with URN 'beam:external:java:spanner:update:v1'"
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:06 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 'org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar$WriteBuilder$Configuration' has no schema registered. Attempting to construct with setter approach."
DEBUG:root:Sending SIGINT to job_server
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:44301
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7ff5f5428a60> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_Impulse_2\n  Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Generate_3\n  Generate:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map to row_4\n  Map to row:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/MapElements/Map/ParMultiDo(Anonymous)\n  Write to Spanner/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/To mutation group/ParMultiDo(ToMutationGroup)\n  Write to Spanner/SpannerIO.Write/To mutation group/ParMultiDo(ToMutationGroup):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/Impulse\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema/ParMultiDo(ReadSpannerSchema)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema/ParMultiDo(ReadSpannerSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/RewindowIntoGlobal/Window.Assign\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/RewindowIntoGlobal/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Filter Unbatchable Mutations\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Filter Unbatchable Mutations:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create Batches/ParMultiDo(GatherSortCreateBatches)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create Batches/ParMultiDo(GatherSortCreateBatches):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Merge\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Merge:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Write batches to Spanner\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Write batches to Spanner:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7ff5f54291e0> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_Impulse_2\n  Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Generate_3\n  Generate:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map to row_4\n  Map to row:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/MapElements/Map/ParMultiDo(Anonymous)\n  Write to Spanner/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/To mutation group/ParMultiDo(ToMutationGroup)\n  Write to Spanner/SpannerIO.Write/To mutation group/ParMultiDo(ToMutationGroup):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/Impulse\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema/ParMultiDo(ReadSpannerSchema)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema/ParMultiDo(ReadSpannerSchema):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine:beam:transform:combine_per_key_precombine:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/RewindowIntoGlobal/Window.Assign\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/RewindowIntoGlobal/Window.Assign:beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Filter Unbatchable Mutations\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Filter Unbatchable Mutations:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create Batches/ParMultiDo(GatherSortCreateBatches)\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create Batches/ParMultiDo(GatherSortCreateBatches):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Merge\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Merge:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Write batches to Spanner\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Write batches to Spanner:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/flink/1.12/job-server/build/libs/beam-runners-flink-1.12-job-server-2.29.0-SNAPSHOT.jar'> '--flink-master' '[auto]' '--artifacts-dir' '/tmp/beam-temphfpzpv53/artifacts0k_0qv07' '--job-port' '49827' '--artifact-port' '0' '--expansion-port' '0']
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:27:56.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:02.318Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:16.729Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:16.774Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:16.811Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_22_19_42-10931258595129165273 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:.... PyVersion ---> 3.6.8 (default, Dec 24 2018, 19:24:27) 
[GCC 5.4.0 20160609]
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:.... Setting up!
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:.... Spanner Client created!
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Creating test database: pybeam-read-d9e1eb2c484d3a4
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.admin
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.admin HTTP/1.1" 200 241
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Creating database: Done! name: "projects/apache-beam-testing/instances/beam-test/databases/pybeam-read-d9e1eb2c484d3a4"
state: READY
create_time {
  seconds: 1613111310
  nanos: 952774000
}

INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Dummy Data: Adding dummy data...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-11_22_22_19-8483241608154996889 after 360 seconds
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data HTTP/1.1" 200 241
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Spanner Read IT Setup Complete...
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Running Spanner via sql
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:126: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
  sql="select * from Users")
INFO:apache_beam.runners.portability.stager:Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']

> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver createArtifactStagingService'
INFO:apache_beam.utils.subprocess_server:b'INFO: ArtifactStagingService started on localhost:43877'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver createExpansionService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Java ExpansionService started on localhost:42347'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver createJobServer'
INFO:apache_beam.utils.subprocess_server:b'INFO: JobService started on localhost:49827'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver run'
INFO:apache_beam.utils.subprocess_server:b'INFO: Job server now running, terminate with Ctrl+C'
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'experiments' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'flink_master' was already added
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Staging artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.external_10beam:env:docker:v1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Getting 7 artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.ref_Environment_default_environment_1.'

java.lang.OutOfMemoryError: GC overhead limit exceeded
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure




Build failed in Jenkins: beam_PostCommit_Python36 #3528

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3528/display/redirect?page=changes>

Changes:

[sychen] Integrate BigQuery sink file loads with GroupIntoBatches

[chamikaramj] Fixes a checkstyle error in UdfTestProvider

[noreply] [BEAM-11647] Fix go:goBuild gradle rules for build collision (#13958)

[noreply] [BEAM-11611] Add transformation for computing approximate quantiles.


------------------------------------------
[...truncated 52.01 MB...]
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/63db47b1-9777-4e27-8d90-648a1cb8596b?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon88b15247b4c1eeae302be8362de5407806bc57de/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16130929589602.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
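
The query/checksum pair logged here is how these IT tests verify output: a matcher re-runs the query and compares a checksum of the returned rows. Roughly the pattern below, with constructor arguments as in apache_beam.io.gcp.tests.bigquery_matcher at the time of this build and the values taken from the log:

    from hamcrest.core.core.allof import all_of
    from apache_beam.io.gcp.tests.bigquery_matcher import BigqueryMatcher

    # Verifier that re-runs the query and compares the row checksum.
    verifier = BigqueryMatcher(
        project='apache-beam-testing',
        query='SELECT fruit from '
              '`python_query_to_table_16130929589602.output_table`;',
        checksum='158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72')
    # Attached to the test pipeline so it runs after a successful job:
    extra_opts = {'on_success_matcher': all_of(verifier)}
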
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:31:39.511Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:32:01.621Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:32:01.657Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-11_17_18_54-7523743327145872731 after 901 seconds
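
The timing-out warning above is the harness giving up its bounded wait, not the job failing; the call behind it is PipelineResult.wait_until_finish, sketched here with an illustrative pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions())  # runner flags omitted here
    _ = p | beam.Create([0, 1, 2, 3])
    result = p.run()
    # Unbounded wait; the Dataflow result also accepts e.g.
    # result.wait_until_finish(duration=900 * 1000), a ~900 s bounded wait
    # in milliseconds, which produces the "Timing out..." warning above.
    result.wait_until_finish()
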
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16130927057658.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/e5d0e8ed-cee0-4078-899a-fe4fc306ab18?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon7f61c323_aeeb_4d7d_80ef_d0d637a85861/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(2,), (1,), (0,), (3,)]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:34:51.943Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:34:55.087Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:34:55.153Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:34:55.203Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:34:55.274Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:35:04.621Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:35:04.708Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:35:04.795Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:35:04.844Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:35:04.872Z: JOB_MESSAGE_BASIC: Stopping worker pool...
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16130927057658?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
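
That raw DELETE with deleteContents=true is the test teardown dropping its temporary dataset; the equivalent call through the google-cloud-bigquery client would look roughly like this (dataset name from the log):

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    # Drop the temp dataset and everything in it, mirroring the REST call.
    client.delete_dataset('python_pubsub_bq_16130927057658',
                          delete_contents=True, not_found_ok=True)
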
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:03.999Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:04.070Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:04.103Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_17_28_16-18379670723856657804 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:56.984Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:57.275Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:57.328Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:36:57.388Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:06.459Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:06.564Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:06.647Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:06.699Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:06.797Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:50.256Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:50.307Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:37:50.331Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_17_30_50-2518247285197667716 is in state JOB_STATE_DONE
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_12-12464484560428781212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_33_48-9563169671223276733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_41_31-18176197672550954742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_49_30-11887594259023081667?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_57_40-14230103719315289510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_06_28-13439566737958316317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_15_19-16110847211739352764?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_23_29-6753957911805381513?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_13-14895034912401978560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_40_45-17841655622130133555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_52_00-17917604901300067121?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_01_17-4609306607130835337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_10_42-11669480206395729429?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_20_13-1508775441609466563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_12-3528923500515313751?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_31_49-7786064087042097221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_40_57-10283029248124923944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_49_39-5450102025756739100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_57_56-8225735703687410682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_06_00-14186084235363518330?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_14_19-9068926466613648021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_22_49-12835011281093869351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_05-6666259357687498714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_41_32-18433748992154729188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_52_27-6279290833192545019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_00_57-1366907050984765791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_09_15-7520907720613156893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_19_02-17717521272370176844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_27_13-13537445006973098316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_26_46-14167890648672002905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_36_47-5844573092951654549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_44_41-8324829595210515324?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_52_33-11300501664017950373?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_02_01-16897479815861363123?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_18_54-7523743327145872731?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_08-15147454220422581267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_29_12-17106973129063396197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_38_40-3456521468514791037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_48_02-3593167300178092018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_56_20-9449510220533454101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_05_38-3893073181835013579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_13_45-11037658742783064193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_23_05-3895201068604191735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_18-11515605008009207045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_28_40-6792865386514731396?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_40_14-3980829019486153319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_50_15-16100226895180187887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_58_22-9337343074012165772?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_07_51-4380697502982411076?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_15_16-17010470248725757900?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_22_57-572572997238307328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_30_50-2518247285197667716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_20_07-13489226222914746867?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_28_51-7881098678410217211?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_37_49-3538431826825061149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_45_51-8134209416440730442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_16_53_42-11256440510234509919?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_02_17-10466226381823033492?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_10_57-5790033962046039902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_19_53-13298404101611245870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_28_16-18379670723856657804?project=apache-beam-testing
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4732.973s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 37m 36s
213 actionable tasks: 164 executed, 45 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/jdl4yhuqas4qi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python36 - Build # 3527 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python36 - Build # 3527 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python36/3527/ to view the results.