Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/02/12 01:28:03 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #883

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/883/display/redirect?page=changes>

Changes:

[sychen] Integrate BigQuery sink file loads with GroupIntoBatches

[chamikaramj] Fixes a checkstyle error in UdfTestProvider

[noreply] [BEAM-11647] Fix go:goBuild gradle rules for build collision (#13958)

[noreply] [BEAM-11611] Add transformation for computing approximate quantiles.


------------------------------------------
[...truncated 47.76 MB...]
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWdfr1D4AAeH/8P//eCtAY5lNjencgr//3+ZjxC4AYEAAQEACmTUG2Mg0TRGpqNlMnpDTanqGhiY1A9TTIDQaaDQ2poAASpoCnoAFPTVPIymTJptR6jIDag0NABoyaAMgxTSNRqbSDQaGmgA9QBo0yAAAADRpkDjQ0DRpkaaNMgMTBAADQGgNMgMCZD0L9ag7CuKta8zHIRSEDwxuHMZSINFcN0tB/WzMvdzmRdRAcHBHpRAegJAikTWRjRFwi9WkiPLuQmPsgUZGYNmJaM+lIbndwLfa/cPW7aH7HIzUtTJjkgkqVadzAiBcdmyjWNPjleEZrwVsQQgKNa3n4YZbCm+E8V25Rhwimuy2VcWky2RrQl2cl/C40PXOPGD50v1WAdZ4fLDTf9qAJIGhK5YeW/f506XHs0CcFBA9DjC8kNVhE9bTfc14stYnedJCU0lp21jwfqd7d/g3a1qcTZ5rRKUoWm2ldIRdUyq7NtZMOFSsrSYXWoMCDvNol+LnKxcY7Q3nI1qQrW8cNFmsSPc9fVobYXmQI1vtUwy0Av3QwCFHAeJFFSlTxM2MpukLnrBY65Br+YS7Ii+5zkNhXLjYrIZ5rSvyKtMZbdX1E5Ekv2L6WYGWLEXDl7ISNeiPZ0LUG4JU8iclz2D3JWqR0RiTRETgpUEHvDGMp6ASoGL2WjRF5XlFPIZCS04oMFDVnlpZ2RvE+QiAgqUiL2JRwB5QFMaDrth/ROjp7TsTlbsmvU4wGLngGpqiuyEBpNOzEglNF7NZgxu7vTTG5Kwermh2HTlKCUFs7YtW3NzaRllFUIyYzGGIctWuN4scrpEHDVQOJdRMDzPVnHorRyeiajXP8t+aXrzli7yvRsldTnYT8J4MOCDZKGKN/PgQlJpqYTUwMahjkMQa6qoiJENNBkhKvhAtUc9MhGgfFAi1O/bY/nEUpquJbn5d3pxdIsYybogSogkidI1GHHPpg8jtpvTrjNDQm/btVLqmXihF6+9VpjhB8hTLcw76OkgiGrOqRX18f/F3JFOFCQ1+vUPg",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
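
The "serialized_fn" value in the job description above is a base64 string whose "Qlpo" prefix decodes to the bzip2 magic bytes b"BZh", so the blob can be unwrapped with the standard library if you want to confirm its size or that it round-trips. The decompressed payload is an SDK-internal serialized function spec and not a supported format to parse; the helper below is only a minimal sketch under that assumption.

import base64
import bz2

def peek_serialized_fn(blob: str) -> bytes:
    # Assumption: blob is base64(bzip2(payload)); the payload itself is
    # SDK-internal and not meant to be interpreted here.
    padded = blob + "=" * (-len(blob) % 4)  # tolerate missing base64 padding
    raw = base64.b64decode(padded)
    if raw[:3] != b"BZh":
        raise ValueError("not a bzip2 stream")
    return bz2.decompress(raw)

# Example: payload = peek_serialized_fn("QlpoOTFBWSZTWdfr1D4...")
#          print(len(payload))
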
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_17_12_31-16834233999587771251 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-02-12T01:20:13.662261Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-11_17_20_09-6375821105215016352'
 location: 'us-central1'
 name: 'beamapp-jenkins-0212012003-290899'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-12T01:20:13.662261Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-11_17_20_09-6375821105215016352]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-11_17_20_09-6375821105215016352
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_17_20_09-6375821105215016352?project=apache-beam-testing
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16130923384030.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/75b15a54-7022-4ef3-bbaf-d05df060a0b1?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon125dcf6b9aae248116e7f1d6d6b3d73bd29abfba/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16130923384030.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
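
The two matcher lines above (run the verification query, then generate a checksum over the returned rows) can be reproduced outside the test harness with the BigQuery client library. This is only a minimal sketch, not the actual apache_beam.io.gcp.tests.bigquery_matcher code, and the checksum scheme shown (SHA-1 over sorted, comma-joined rows) is an assumption made for illustration.

import hashlib
from google.cloud import bigquery

def query_checksum(project: str, query: str) -> str:
    # Run the verification query, then hash the rows order-insensitively.
    client = bigquery.Client(project=project)
    rows = [",".join(str(v) for v in row.values())
            for row in client.query(query).result()]
    digest = hashlib.sha1()
    for line in sorted(rows):
        digest.update(line.encode("utf-8"))
    return digest.hexdigest()

# Example, matching the query logged above:
# query_checksum("apache-beam-testing",
#     "SELECT fruit FROM `python_query_to_table_16130923384030.output_table`")
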
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:18.624Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:18.718Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:18.900Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:18.972Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:19.007Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_17_20_09-6375821105215016352 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:15.810Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-11_17_20_09-6375821105215016352. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:16.536Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-11_17_20_09-6375821105215016352.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:19.305Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.062Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.109Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.168Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.233Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.351Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.420Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.448Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.497Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.539Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.578Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.621Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.660Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.690Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.740Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.781Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:21.842Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:22.057Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:22.147Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:22.199Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:22.242Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:22.882Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:22.989Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:23.068Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:52.223Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
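
The message above means the project has reached the limit of 100 Dataflow-created custom metric descriptors, and the suggested remedy is to delete descriptors that are no longer used. A minimal cleanup sketch with the Cloud Monitoring client library follows; it assumes google-cloud-monitoring >= 2.0, deciding which descriptors are actually unused is left to you, and deletion is irreversible.

from google.cloud import monitoring_v3

def list_custom_metric_descriptors(project_id: str):
    # List descriptors of the custom.googleapis.com/* form mentioned in the log.
    client = monitoring_v3.MetricServiceClient()
    return list(client.list_metric_descriptors(request={
        "name": f"projects/{project_id}",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }))

def delete_metric_descriptor(descriptor_name: str):
    # descriptor_name is the full resource name, e.g.
    # projects/<project>/metricDescriptors/custom.googleapis.com/<metric>
    client = monitoring_v3.MetricServiceClient()
    client.delete_metric_descriptor(request={"name": descriptor_name})

# Example:
# for d in list_custom_metric_descriptors("apache-beam-testing"):
#     print(d.type)
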
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:20:58.636Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:21:12.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:21:12.939Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:21:12.973Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_17_13_26-8566067238909770210 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:21:34.588Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:21:34.633Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-11_17_08_14-8774986632650021931 after 900 seconds
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16130920823517.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/dcab9683-a1a5-4b6c-9ac7-64171ade8f2e?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon9de13fdc_c7f3_4ede_b341_e61a72b48847/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(0,), (3,), (2,), (1,)]
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16130920823517?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:50.140Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:50.423Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:50.501Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:50.609Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:59.754Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:59.848Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:26:59.945Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:27:00.029Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:27:00.072Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:27:49.445Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:27:49.498Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T01:27:49.531Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_17_20_09-6375821105215016352 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4783.018s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 27m 41s
214 actionable tasks: 159 executed, 51 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/vctdf2qpd24ce

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #888

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/888/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python38 #887

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/887/display/redirect?page=changes>

Changes:

[heejong] [BEAM-11520] Stage extra PyPI dependencies with generated requirements

[heejong] raise exception for non-file type artifacts


------------------------------------------
[...truncated 47.63 MB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYJmaxgABPZKJBYnJGanxSamJuXrJ+SmpRcVQakqPsFticUlAUWZuZklmWWqxM0h4yuQpmo1Taqf08MenJebkJCUmZ8eD1U/JYOjhDshMzs5JRVGYVJykBwDOUCqY",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYJmaxgABPZKJBYnJGanxSamJuXrJ+SmpRcVQakqPsFticUlAUWZuZklmWWqxM0h4yuQpmo1Taqf08MenJebkJCUmZ8eD1U/JYOjhDshMzs5JRVGYVJykBwDOUCqY",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "QlpoOTFBWSZTWdfr1D4AAeH/8P//eCtAY5lNjencgr//3+ZjxC4AYEAAQEACmTUG2Mg0TRGpqNlMnpDTanqGhiY1A9TTIDQaaDQ2poAASpoCnoAFPTVPIymTJptR6jIDag0NABoyaAMgxTSNRqbSDQaGmgA9QBo0yAAAADRpkDjQ0DRpkaaNMgMTBAADQGgNMgMCZD0L9ag7CuKta8zHIRSEDwxuHMZSINFcN0tB/WzMvdzmRdRAcHBHpRAegJAikTWRjRFwi9WkiPLuQmPsgUZGYNmJaM+lIbndwLfa/cPW7aH7HIzUtTJjkgkqVadzAiBcdmyjWNPjleEZrwVsQQgKNa3n4YZbCm+E8V25Rhwimuy2VcWky2RrQl2cl/C40PXOPGD50v1WAdZ4fLDTf9qAJIGhK5YeW/f506XHs0CcFBA9DjC8kNVhE9bTfc14stYnedJCU0lp21jwfqd7d/g3a1qcTZ5rRKUoWm2ldIRdUyq7NtZMOFSsrSYXWoMCDvNol+LnKxcY7Q3nI1qQrW8cNFmsSPc9fVobYXmQI1vtUwy0Av3QwCFHAeJFFSlTxM2MpukLnrBY65Br+YS7Ii+5zkNhXLjYrIZ5rSvyKtMZbdX1E5Ekv2L6WYGWLEXDl7ISNeiPZ0LUG4JU8iclz2D3JWqR0RiTRETgpUEHvDGMp6ASoGL2WjRF5XlFPIZCS04oMFDVnlpZ2RvE+QiAgqUiL2JRwB5QFMaDrth/ROjp7TsTlbsmvU4wGLngGpqiuyEBpNOzEglNF7NZgxu7vTTG5Kwermh2HTlKCUFs7YtW3NzaRllFUIyYzGGIctWuN4scrpEHDVQOJdRMDzPVnHorRyeiajXP8t+aXrzli7yvRsldTnYT8J4MOCDZKGKN/PgQlJpqYTUwMahjkMQa6qoiJENNBkhKvhAtUc9MhGgfFAi1O/bY/nEUpquJbn5d3pxdIsYybogSogkidI1GHHPpg8jtpvTrjNDQm/btVLqmXihF6+9VpjhB8hTLcw76OkgiGrOqRX18f/F3JFOFCQ1+vUPg",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-02-13T01:31:29.504818Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-12_17_31_27-477230034659411030'
 location: 'us-central1'
 name: 'beamapp-jenkins-0213013120-639050'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-13T01:31:29.504818Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-12_17_31_27-477230034659411030]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-12_17_31_27-477230034659411030
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_17_31_27-477230034659411030?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_17_31_27-477230034659411030 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:31.688Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-12_17_31_27-477230034659411030. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:31.782Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-12_17_31_27-477230034659411030.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:34.025Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:35.769Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:35.812Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:35.850Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:35.893Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:35.964Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.007Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.035Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.072Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.106Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.138Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.197Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.234Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.258Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.283Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.322Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.355Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.559Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.620Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.665Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:36.699Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:37.059Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:37.139Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:31:37.235Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:32:08.311Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:32:12.933Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:32:48.482Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:32:48.513Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:33:52.054Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-12_17_18_38-12229857750236795588 after 903 seconds
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16131791067744.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/592fcae5-37f0-49c3-aaf6-42c8b726c0d3?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anone1d66967_8e58_4ee1_8e6f_a9b32c2e5bdc/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(0,), (2,), (3,), (1,)]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:33:58.738Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:33:58.801Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:33:58.857Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:33:58.930Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:08.351Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:08.436Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:08.513Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:08.606Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:08.628Z: JOB_MESSAGE_BASIC: Stopping worker pool...
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16131791067744?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:58.730Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:58.777Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:34:58.823Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_17_26_19-6873107238513231978 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:21.944Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:21.998Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:22.041Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:22.098Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:31.410Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:31.495Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:31.569Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:31.735Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:38:31.774Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:39:21.106Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:39:21.156Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-13T01:39:21.190Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_17_31_27-477230034659411030 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 5802.368s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 39m 11s
211 actionable tasks: 151 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/joyw4owf34mms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python38 #886

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/886/display/redirect?page=changes>

Changes:

[nahian97] Changing query to sql to fix doc

[evgeny.belousov] [BEAM-11807] SDK Worker multithreading causes boto3 the KeyError


------------------------------------------
[...truncated 46.94 MB...]
        "serialized_fn": "QlpoOTFBWSZTWdfr1D4AAeH/8P//eCtAY5lNjencgr//3+ZjxC4AYEAAQEACmTUG2Mg0TRGpqNlMnpDTanqGhiY1A9TTIDQaaDQ2poAASpoCnoAFPTVPIymTJptR6jIDag0NABoyaAMgxTSNRqbSDQaGmgA9QBo0yAAAADRpkDjQ0DRpkaaNMgMTBAADQGgNMgMCZD0L9ag7CuKta8zHIRSEDwxuHMZSINFcN0tB/WzMvdzmRdRAcHBHpRAegJAikTWRjRFwi9WkiPLuQmPsgUZGYNmJaM+lIbndwLfa/cPW7aH7HIzUtTJjkgkqVadzAiBcdmyjWNPjleEZrwVsQQgKNa3n4YZbCm+E8V25Rhwimuy2VcWky2RrQl2cl/C40PXOPGD50v1WAdZ4fLDTf9qAJIGhK5YeW/f506XHs0CcFBA9DjC8kNVhE9bTfc14stYnedJCU0lp21jwfqd7d/g3a1qcTZ5rRKUoWm2ldIRdUyq7NtZMOFSsrSYXWoMCDvNol+LnKxcY7Q3nI1qQrW8cNFmsSPc9fVobYXmQI1vtUwy0Av3QwCFHAeJFFSlTxM2MpukLnrBY65Br+YS7Ii+5zkNhXLjYrIZ5rSvyKtMZbdX1E5Ekv2L6WYGWLEXDl7ISNeiPZ0LUG4JU8iclz2D3JWqR0RiTRETgpUEHvDGMp6ASoGL2WjRF5XlFPIZCS04oMFDVnlpZ2RvE+QiAgqUiL2JRwB5QFMaDrth/ROjp7TsTlbsmvU4wGLngGpqiuyEBpNOzEglNF7NZgxu7vTTG5Kwermh2HTlKCUFs7YtW3NzaRllFUIyYzGGIctWuN4scrpEHDVQOJdRMDzPVnHorRyeiajXP8t+aXrzli7yvRsldTnYT8J4MOCDZKGKN/PgQlJpqYTUwMahjkMQa6qoiJENNBkhKvhAtUc9MhGgfFAi1O/bY/nEUpquJbn5d3pxdIsYybogSogkidI1GHHPpg8jtpvTrjNDQm/btVLqmXihF6+9VpjhB8hTLcw76OkgiGrOqRX18f/F3JFOFCQ1+vUPg",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:01.470Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:01.519Z: JOB_MESSAGE_DEBUG: Executing success step success3
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:01.583Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:01.734Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:01.763Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2021-02-12T19:16:04.120371Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-02-12_11_16_02-17181807000752086401'
 location: 'us-central1'
 name: 'beamapp-jenkins-0212191555-167763'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-02-12T19:16:04.120371Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-12_11_16_02-17181807000752086401]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-12_11_16_02-17181807000752086401
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_11_16_02-17181807000752086401?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_16_02-17181807000752086401 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:06.344Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-12_11_16_02-17181807000752086401. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:06.450Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-12_11_16_02-17181807000752086401.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:08.946Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.511Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.673Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.710Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.740Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.820Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.877Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.952Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:09.994Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.026Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.060Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.095Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.127Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.164Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.191Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.222Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.247Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.581Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.639Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.682Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:10.718Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:11.156Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:11.241Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:11.298Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:43.298Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:43.344Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:43.395Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:44.789Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_08_04-4468296123825005347 is in state JOB_STATE_DONE
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-12_11_01_35-6938131510786021408 after 902 seconds
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16131568699658.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/24d3caed-919c-4f6e-a98a-fef9d2e1514d?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16131564772125.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon155c2910110de2add8218d88dcd73f4b93440c66/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16131568699658.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/6246bcc0-2261-4448-b193-0a38aee0cefa?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon9c09bbfb_f189_4024_a560_96b522ae2789/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(0,), (1,), (2,), (3,)]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:16:52.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:16.253Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:16.286Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:30.751Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:33.952Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:34.108Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:34.179Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:34.256Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:43.755Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:43.825Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:43.937Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:43.998Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:17:44.037Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:32.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:32.675Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:18:32.718Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_10_57-7919292035663140413 is in state JOB_STATE_DONE
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16131564772125?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:40.435Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:40.530Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:40.612Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:40.673Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:49.936Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:50.038Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:50.102Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:50.155Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:23:50.188Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:24:35.153Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:24:35.207Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T19:24:35.243Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_11_16_02-17181807000752086401 is in state JOB_STATE_DONE
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4788.560s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 23s
211 actionable tasks: 157 executed, 50 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/h662nf323psle

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python38 #885

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/885/display/redirect?page=changes>

Changes:

[David Morávek] Simplify LateDataDropping runner.


------------------------------------------
[...truncated 47.93 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.677Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.710Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.751Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.768Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.792Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.825Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.855Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.879Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.909Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:35.934Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.134Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.200Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.256Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.283Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.627Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.697Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:36.760Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-12_05_04_43-8864135780838708137 after 902 seconds
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16131350692403.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:19:57.232Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/07eaec5f-6d73-46fb-a09d-fdc8c7cbb88e?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon7b81683a_ffa3_476c_ac47_d006efb1867b/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(3,), (2,), (1,), (0,)]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:20:02.183Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:20:34.805Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:20:34.831Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16131350692403?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:21:50.397Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:21:57.026Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:21:57.152Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:21:57.235Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:21:57.328Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:06.569Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:06.662Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:06.813Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:06.939Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:06.971Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:56.340Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:56.413Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:22:56.471Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_05_15_52-3161792730476028403 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:02.895Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:03.136Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:03.196Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:03.259Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:12.497Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:12.598Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:12.682Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:12.736Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:26:12.765Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:27:09.899Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:27:09.937Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T13:27:09.960Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-12_05_19_29-4136756640163243412 is in state JOB_STATE_DONE
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_43-6869948100456967042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_22_15-10310973713032071269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_30_00-14693800353920273985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_37_48-15361781575796819374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_45_53-16271552939628640449?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_54_25-14668311675422224435?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_02_59-11642910905707730662?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_11_26-17230714849976792770?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_19_29-4136756640163243412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_43-2494960538764069445?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_29_59-11825409205392527813?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_39_20-8404507169094058005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_47_53-6446150946397379232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_04_43-8864135780838708137?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_40-16710670123390162829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_29_37-4157989683335915617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_37_37-15425508805602294065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_46_43-11713351159167738807?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_55_26-7616501113624039984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_02_40-8215378599197107274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_10_08-594183661049875935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_43-9776849657378610279?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_20_11-10482096825331666322?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_28_36-2233077456471616210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_37_07-7825787299411015243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_45_24-15852805671547211459?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_52_54-8932238713308534504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_01_05-5125016884721849875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_09_32-13644042669768285330?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_10_31-11797736898390685718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_19_40-8435191400277762193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_28_14-14213024006291729130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_38_29-12798267374093077191?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_47_52-9889549381615483475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_57_55-16367703109517209844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_07_09-1304919923280053954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_15_52-3161792730476028403?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_41-16558271256779745260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_18_27-2794358562631509371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_27_42-10812147338862447004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_36_13-10241456705405333886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_44_46-4546402935423608313?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_53_10-13074986175660614563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_01_19-9463917682357049183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_09_59-5843011103811821837?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_43-1110062779642531572?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_18_01-9104159087150398990?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_26_00-13204248948121149182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_33_50-18286222985559700196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_41_21-11108657248064267525?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_49_05-2002180699696898103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_57_47-16942306887724425892?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_07_28-3192003710728166070?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_08_40-10266872962113406369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_17_16-3547389921303938472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_27_56-8228669551841117664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_38_52-16224763735251643217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_46_42-16399050764601698166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_04_57_14-17672766178847084293?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_05_56-1541095586883154457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-12_05_14_01-3652758501427467600?project=apache-beam-testing
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4741.630s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 26m 47s
211 actionable tasks: 162 executed, 45 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
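The inotify warning above generally means fs.inotify.max_user_watches is set too low on the build agent; raising it requires root (e.g. via sysctl). A minimal Python sketch for inspecting the current limit is below; the procfs path is the standard Linux location and is an assumption about the agent's OS:

    # Read the current inotify watch limit; Linux-only, read-only check.
    with open("/proc/sys/fs/inotify/max_user_watches") as f:
        print("fs.inotify.max_user_watches =", f.read().strip())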

Publishing build scan...
https://gradle.com/s/ls6zx2wxoyx7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #884

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/884/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10925] Don't publish udf-test-provider to Maven.

[noreply] [BEAM-11780] Use vendored cloudbuild python client. (#13933)

[noreply] [BEAM-11804] Remove vendors/sdk-java-extensions-protobuf (#13968)

[noreply] [BEAM-7372][BEAM-9372] cleanup python 2.x and 3.5 codepaths (#13913)


------------------------------------------
[...truncated 47.43 MB...]
 stageStates: []
 startTime: '2021-02-12T07:12:40.315428Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2021-02-11_23_12_38-17831089953775427068]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2021-02-11_23_12_38-17831089953775427068
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-11_23_12_38-17831089953775427068?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.253Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/FilesToRemoveImpulse/Read+read/ReadFromBigQuery/MapFilesToRemove
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.649Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/MapFilesToRemove.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.817Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.881Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.957Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:45.391Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_1957157966973948711". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_1957157966973948711".
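Besides the bq CLI command suggested in the message above, the same load-job status check can be done with the google-cloud-bigquery Python client. This is a hedged sketch, not part of the test suite; it assumes the client library is installed, application-default credentials can read the apache-beam-testing project, and the job ran in the US location:

    from google.cloud import bigquery

    # Look up the BigQuery load job that Dataflow created and print its state.
    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_1957157966973948711", location="US")  # location is an assumption
    print(job.job_id, job.state)  # expected to reach DONE once the import finishes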
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_23_12_38-17831089953775427068 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.474Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-02-11_23_12_38-17831089953775427068. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:42.771Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-02-11_23_12_38-17831089953775427068.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:45.145Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:45.836Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:45.890Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:45.944Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:45.985Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.054Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.098Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.126Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.157Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.195Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.240Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.273Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.309Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.341Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.373Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.407Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.442Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.629Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.708Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.766Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:46.804Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:47.345Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:47.424Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:47.521Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:56.144Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_1957157966973948711" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:56.902Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/Write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:56.983Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:57.066Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:57.133Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:57.211Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:12:57.298Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:03.261Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:03.436Z: JOB_MESSAGE_DEBUG: Executing success step success3
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:03.889Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:04.067Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:04.128Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:17.890Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
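The message above is informational: the project has hit the 100 custom metric descriptor limit, so per-job custom.googleapis.com/* metrics are not registered (the dataflow.googleapis.com/job/user_counter metric still carries the values). If the custom metrics matter, stale descriptors can be reviewed and deleted with the Cloud Monitoring client; a sketch assuming google-cloud-monitoring v2.x and application-default credentials:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    parent = "projects/apache-beam-testing"

    # List custom metric descriptors; the delete call is commented out on purpose.
    for descriptor in client.list_metric_descriptors(request={"name": parent}):
        if descriptor.type.startswith("custom.googleapis.com/"):
            print(descriptor.name)
            # client.delete_metric_descriptor(request={"name": descriptor.name})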
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:22.599Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:56.016Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:56.067Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:13:56.103Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_23_05_44-4317322350113131654 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16131135313440.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:14:06.868Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:14:06.926Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/5a628c67-bdc8-40cb-be6d-debcc979ae37?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon2aeea87b777392cad373048fba5525a9d2c6f85f/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16131135313440.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-11_22_59_35-11025617870226829109 after 900 seconds
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16131131657910.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/cfd660b8-70b0-41d6-b4d4-050810736089?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon2494fe78_faea_4987_87a2_1d1c2f6a9ac6/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(2,), (1,), (3,), (0,)]
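The matcher output above comes from running the verification query directly against BigQuery. An equivalent standalone check with the google-cloud-bigquery client might look like the sketch below; this is an illustration, not the Beam matcher itself, and it assumes the library is installed and default credentials can read the apache-beam-testing project:

    from google.cloud import bigquery

    # Re-run the verification query and print the values in sorted order.
    client = bigquery.Client(project="apache-beam-testing")
    rows = client.query(
        "SELECT number FROM python_pubsub_bq_16131131657910.output_table"
    ).result()
    print(sorted(row.number for row in rows))  # e.g. [0, 1, 2, 3]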
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:15.725Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:19.124Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:19.350Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:19.510Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:19.617Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:28.517Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:28.635Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:28.719Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:28.767Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:15:28.799Z: JOB_MESSAGE_BASIC: Stopping worker pool...
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16131131657910?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:16:13.317Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:16:13.371Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:16:13.419Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_23_08_18-13861554709401810303 is in state JOB_STATE_DONE
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:45.627Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:46.430Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:46.490Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:46.553Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:55.175Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:55.251Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:55.339Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:55.412Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:19:55.456Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:20:39.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:20:39.213Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T07:20:39.238Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_23_12_38-17831089953775427068 is in state JOB_STATE_DONE
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4688.899s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 197

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:postCommitPy38IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 30s
211 actionable tasks: 151 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/fabntxrcoxl3c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org