Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/06/09 07:34:50 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #1297

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1297/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12439] Reuse Java job servers in spark_runner.py.


------------------------------------------
[...truncated 44.85 MB...]
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/d8a85fbf-f005-4b26-ae7f-dd20895ffe0d?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/d8a85fbf-f005-4b26-ae7f-dd20895ffe0d?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16232235483788 in project apache-beam-testing
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_flight_delays (apache_beam.examples.dataframe.flight_delays_it_test.FlightDelaysTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_aggregation (apache_beam.examples.dataframe.taxiride_it_test.TaxirideIT) ... ok
test_enrich (apache_beam.examples.dataframe.taxiride_it_test.TaxirideIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: BEAM-12352: enable once maxBytesRewrittenPerCall works again
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: BEAM-12352: enable once maxBytesRewrittenPerCall works again
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ERROR
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

======================================================================
ERROR: test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py",> line 89, in test_inspection
    assert_that(output.info_type, equal_to(['EMAIL_ADDRESS']), 'Type matches')
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py",> line 585, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    result = super(TestPipeline, self).run(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py",> line 537, in run
    return Pipeline.from_runner_api(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py",> line 564, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    self.result = super(TestDataflowRunner,
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 582, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/retry.py",> line 253, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 657, in create_job
    self.create_job_description(job)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 743, in create_job_description
    resources = self._stage_resources(job.proto_pipeline, job.options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 609, in _stage_resources
    staged_resources = resource_stager.stage_job_resources(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 382, in stage_job_resources
    self.stage_artifact(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 1013, in stage_artifact
    self._dataflow_application_client._gcs_file_copy(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/retry.py",> line 253, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 559, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 635, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1152, in Insert
    return self._RunMethod(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",> line 714, in _RunMethod
    http_response = upload.InitializeUpload(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/transfer.py",> line 908, in InitializeUpload
    return self.StreamInChunks()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/transfer.py",> line 1018, in StreamInChunks
    return self.__StreamMedia(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/transfer.py",> line 957, in __StreamMedia
    response = send_func(self.stream.tell())
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/transfer.py",> line 942, in CallSendChunk
    return self.__SendChunk(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/transfer.py",> line 1120, in __SendChunk
    return self.__SendMediaRequest(request, end)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/transfer.py",> line 1031, in __SendMediaRequest
    response = http_wrapper.MakeRequest(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/http_wrapper.py",> line 348, in MakeRequest
    return _MakeRequestNoRetry(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/http_wrapper.py",> line 397, in _MakeRequestNoRetry
    info, content = http.request(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/oauth2client/transport.py",> line 173, in new_request
    resp, content = request(orig_request_method, uri, method, body,
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/oauth2client/transport.py",> line 280, in request
    return http_callable(uri, method=method, body=body, headers=headers,
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/oauth2client/transport.py",> line 173, in new_request
    resp, content = request(orig_request_method, uri, method, body,
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/oauth2client/transport.py",> line 280, in request
    return http_callable(uri, method=method, body=body, headers=headers,
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/httplib2/__init__.py",> line 1708, in request
    (response, content) = self._request(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/httplib2/__init__.py",> line 1424, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/httplib2/__init__.py",> line 1376, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.8/http/client.py", line 1347, in getresponse
    response.begin()
  File "/usr/lib/python3.8/http/client.py", line 307, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.8/http/client.py", line 268, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.8/socket.py", line 669, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.8/ssl.py", line 1241, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.8/ssl.py", line 1099, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
-------------------- >> begin captured logging << --------------------
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.8_sdk:2.32.0.dev
root: INFO: Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python38:beam-master-20210526
root: INFO: Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python38:beam-master-20210526" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function pack_combiners at 0x7fc067292820> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 17 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create-Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2962-_4\n  Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-Map-decode-_6\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_InspectForDetails-ParDo-_InspectFn-_8\n  InspectForDetails/ParDo(_InspectFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_ParDo-CallableWrapperDoFn-ParDo-CallableWrapperDoFn-_10\n  ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Create-Impulse_13\n  Type matches/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Create-FlatMap-lambda-at-core-py-2962-_14\n  Type matches/Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Create-Map-decode-_16\n  Type matches/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-WindowInto-WindowIntoFn-_17\n  Type matches/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-ToVoidKey_18\n  Type matches/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-pair_with_0_20\n  Type matches/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-pair_with_1_21\n  Type matches/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-Flatten_22\n  Type matches/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-GroupByKey_23\n  Type matches/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-Map-_merge_tagged_vals_under_key-_24\n  Type matches/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Unkey_25\n  Type matches/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Match_26\n  Type matches/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function sort_stages at 0x7fc067295040> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 17 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create-Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2962-_4\n  Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create-Map-decode-_6\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_InspectForDetails-ParDo-_InspectFn-_8\n  InspectForDetails/ParDo(_InspectFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_ParDo-CallableWrapperDoFn-ParDo-CallableWrapperDoFn-_10\n  ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Create-Impulse_13\n  Type matches/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Create-FlatMap-lambda-at-core-py-2962-_14\n  Type matches/Create/FlatMap(<lambda at core.py:2962>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Create-Map-decode-_16\n  Type matches/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-WindowInto-WindowIntoFn-_17\n  Type matches/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-ToVoidKey_18\n  Type matches/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-pair_with_0_20\n  Type matches/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-pair_with_1_21\n  Type matches/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-Flatten_22\n  Type matches/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-GroupByKey_23\n  Type matches/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Group-Map-_merge_tagged_vals_under_key-_24\n  Type matches/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Unkey_25\n  Type matches/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Type-matches-Match_26\n  Type matches/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/requirements.txt in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/pbr-5.6.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/pbr-5.6.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/six-1.16.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/six-1.16.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/soupsieve-2.2.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/soupsieve-2.2.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/parameterized-0.7.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/parameterized-0.7.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/beautifulsoup4-4.9.3.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/beautifulsoup4-4.9.3.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0609071517-540697.1623222917.540839/dataflow-worker.jar...
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0609071517-540697.1623222917.540839%2Fdataflow-worker.jar&uploadType=resumable&upload_id=ABg5-UxX6Dy2Dpb2AW2r3h74TlWNQYJZUja6VztMd01WLdB-i76n2sLWzej1UKqHhSLGM3u-_vn9xEA7wFEc_ANtB2g after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0609071517-540697.1623222917.540839%2Fdataflow-worker.jar&uploadType=resumable&upload_id=ABg5-UxX6Dy2Dpb2AW2r3h74TlWNQYJZUja6VztMd01WLdB-i76n2sLWzej1UKqHhSLGM3u-_vn9xEA7wFEc_ANtB2g after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0609071517-540697.1623222917.540839%2Fdataflow-worker.jar&uploadType=resumable&upload_id=ABg5-UxX6Dy2Dpb2AW2r3h74TlWNQYJZUja6VztMd01WLdB-i76n2sLWzej1UKqHhSLGM3u-_vn9xEA7wFEc_ANtB2g after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0609071517-540697.1623222917.540839%2Fdataflow-worker.jar&uploadType=resumable&upload_id=ABg5-UxX6Dy2Dpb2AW2r3h74TlWNQYJZUja6VztMd01WLdB-i76n2sLWzej1UKqHhSLGM3u-_vn9xEA7wFEc_ANtB2g after exception The read operation timed out
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 71 tests in 5336.487s

FAILED (SKIP=8, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 126

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 51s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/apw6xf44xenn2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #1299

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1299/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1298

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1298/display/redirect>

Changes:


------------------------------------------
[...truncated 44.94 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.448Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.475Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.497Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.529Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.550Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.563Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.565Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.598Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.599Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.632Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.665Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.670Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.710Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.744Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.777Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:28.800Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:37.541Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:41.673Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:41.735Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:41.755Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:41.816Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:41.872Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:41.932Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:42.019Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:48.456Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:48.523Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:48.591Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:48.666Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:48.739Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:48.817Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:52.522Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:52.703Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:52.771Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:52.835Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:52.905Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:52.976Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:59.752Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:59.826Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:59.900Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:34:59.980Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:00.069Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:00.147Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:00.406Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:00.473Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:00.544Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.156Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.230Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.295Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.364Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.650Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.720Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.823Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.881Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:01.910Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:49.353Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:49.389Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-09T13:35:49.415Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-06-09_06_26_46-17681213013954082920 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16232451901170.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/021671c8-dd0d-4bec-b013-dc52c413ca09?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/021671c8-dd0d-4bec-b013-dc52c413ca09?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16232451901170 in project apache-beam-testing
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_flight_delays (apache_beam.examples.dataframe.flight_delays_it_test.FlightDelaysTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_aggregation (apache_beam.examples.dataframe.taxiride_it_test.TaxirideIT) ... ok
test_enrich (apache_beam.examples.dataframe.taxiride_it_test.TaxirideIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: BEAM-12352: enable once maxBytesRewrittenPerCall works again
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: BEAM-12352: enable once maxBytesRewrittenPerCall works again
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 71 tests in 5502.201s

OK (SKIP=8)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_44-13678976735335055890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_18_08-15644576992813186229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_27_05-4707336481498399068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_36_12-12365762234949247209?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_45_55-11013343498689288418?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_56_11-9534955463466832190?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_05_24-17380413772817558346?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_15_07-16641917173463824568?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_44-14678393647277088461?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_29_30-13768845087251308063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_38_44-2215765774229868703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_47_48-907221918843954261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_58_44-8277346201649640955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_06_21-3518849082273352843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_13_51-14855126854891193080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_43-7843795200245049102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_16_37-7732389314600039087?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_25_46-9222914184390977431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_34_33-1910629713808543203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_42_55-5506064562036107536?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_50_47-1619397461271669299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_59_59-5847612245716682079?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_09_00-1667367781045805326?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_17_30-1892642184922178917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_41-7781980441095747902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_24_10-7081761849046954832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_32_07-2970212450095692957?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_39_33-13721104139362301528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_48_06-13863343801248857765?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_57_14-14720187161481049630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_06_11-7749108770528817150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_14_30-17498953153425735682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_06_40-6666211015649203392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_17_18-12191581591878127193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_27_13-11923364298228153397?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_38_28-1520045791763834217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_47_27-7295349423854349636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_56_40-14859384653678583816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_06_29-4178363489758295027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_16_17-9965109310601032080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_26_46-17681213013954082920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_43-17024344299870572661?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_13_59-10145327056464897861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_31_22-16968734600041456636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_54_17-14163612103068341588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_11_03-13234381256395012914?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_44-15604339861264617978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_14_57-12500650148853739741?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_26_04-5169267900011327984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_36_21-8808077582557580829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_46_43-6981265057306913791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_55_05-7101653021885852736?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_04_31-10803263264837710581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_14_39-11948843958911421244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_22_46-7933852894479537714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_04_42-10347924131670749215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_13_45-4352412810849530765?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_22_40-14741217966275152368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_30_43-9215262225238842050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_39_32-6814146773749742406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_48_30-6252712616034580749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_05_57_16-17771603787016415951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_06_06-9094417779483636840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-09_06_13_59-8194710204303443650?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 144

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
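
For local triage, the failing task can be re-run on its own with the extra diagnostics the hint above suggests. A minimal sketch, assuming a checkout of the Beam repository and its standard Gradle wrapper (./gradlew); the task name is copied from the failure message:

    # Re-run only the failing HDFS integration test suite with verbose diagnostics
    ./gradlew :sdks:python:test-suites:direct:py38:hdfsIntegrationTest --info --stacktrace

Appending --scan would additionally publish a build scan with the full console log and task timings.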

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
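
To surface the individual deprecation warnings rather than this summary line, the same invocation can be repeated with the flag named above; again a sketch under the same assumptions:

    # List each Gradle deprecation warning individually
    ./gradlew :sdks:python:test-suites:direct:py38:hdfsIntegrationTest --warning-mode all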

BUILD FAILED in 1h 35m 35s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/q3yozejdwnv2s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
