Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/01/04 19:15:19 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #3375

See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3375/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11457] Add option to skip key-value clone (#13543)


------------------------------------------
[...truncated 40.43 MB...]
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT number FROM python_pubsub_bq_16097866901803.output_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/45c991d3-5169-4bb7-bdc6-1b66fd1bbd3d?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon032aff78_aabc_4870_94a8_c5a62f863314/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(1,), (0,), (3,), (2,)]
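For reference, the verification above boils down to running the logged query against the output table and comparing row tuples. A minimal standalone sketch of the same check, assuming the google-cloud-bigquery client is available and taking the project, dataset, and table names from the log lines above:

    # Standalone sketch of the check performed by bigquery_matcher above.
    # Assumes google-cloud-bigquery is installed and Application Default
    # Credentials are available; all names are copied from the log.
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    query = "SELECT number FROM python_pubsub_bq_16097866901803.output_table"
    rows = [tuple(row.values()) for row in client.query(query).result()]
    assert sorted(rows) == [(0,), (1,), (2,), (3,)], rows

The sort reflects that row order is irrelevant here, as the unordered result [(1,), (0,), (3,), (2,)] above shows.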
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:14.165Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:14.252Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:14.323Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:14.384Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:23.493Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:23.571Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:23.672Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:23.732Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:14:23.770Z: JOB_MESSAGE_BASIC: Stopping worker pool...
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16097866901803?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:15:03.911Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:15:03.949Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-01-04T19:15:03.986Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-01-04_11_08_11-11837047645283451001 is in state JOB_STATE_DONE
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ERROR
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner" does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_it_test.py",> line 70, in test_bigquery_read_1M_python
    self.run_bigquery_io_read_pipeline('1M')
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_it_test.py",> line 62, in run_bigquery_io_read_pipeline
    test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py",> line 95, in run
    assert_that(count, equal_to([known_args.num_records]))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py",> line 582, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py",> line 532, in run
    self._options).run(False)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py",> line 561, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 621, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/utils/retry.py",> line 260, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 655, in create_job
    self.create_job_description(job)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 711, in create_job_description
    resources = self._stage_resources(job.proto_pipeline, job.options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 608, in _stage_resources
    resources, staging_location=google_cloud_options.staging_location)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 316, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 976, in stage_artifact
    local_path_to_artifact, artifact_name)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/utils/retry.py",> line 260, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 572, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 633, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py",> line 715, in _RunMethod
    http_request, client=self.client)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py",> line 908, in InitializeUpload
    return self.StreamInChunks()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py",> line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py",> line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py",> line 875, in RefreshResumableUploadState
    raise exceptions.HttpError.FromResponse(refresh_response)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0104182425-739198.1609784665.739516%2Fdataflow-worker.jar&uploadType=resumable&upload_id=ABg5-Uxy03DcdwdXIzUYleptZAe1NFtKBxYxBOzuXUJT1UbC0KygRPDzWUbqBa3Tef9M7Vjq3QiBAQZSf53SUqy8Tf_oMfCp8g>: response: <{'x-guploader-uploadid': 'ABg5-Uxy03DcdwdXIzUYleptZAe1NFtKBxYxBOzuXUJT1UbC0KygRPDzWUbqBa3Tef9M7Vjq3QiBAQZSf53SUqy8Tf_oMfCp8g', 'content-type': 'application/json; charset=UTF-8', 'date': 'Mon, 04 Jan 2021 18:24:41 GMT', 'vary': 'Origin, X-Origin', 'cache-control': 'no-cache, no-store, max-age=0, must-revalidate', 'expires': 'Mon, 01 Jan 1990 00:00:00 GMT', 'pragma': 'no-cache', 'content-length': '0', 'server': 'UploadServer', 'status': '410'}>, content <>
-------------------- >> begin captured logging << --------------------
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.28.0.dev
root: INFO: Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36:beam-master-20201214
root: INFO: Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36:beam-master-20201214" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function eliminate_common_key_with_none at 0x7f3da5f727b8> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 32 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_read/FilesToRemoveImpulse/Impulse_4\n  read/FilesToRemoveImpulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/FilesToRemoveImpulse/FlatMap(<lambda at core.py:2957>)_5\n  read/FilesToRemoveImpulse/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/FilesToRemoveImpulse/Map(decode)_7\n  read/FilesToRemoveImpulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/MapFilesToRemove_8\n  read/MapFilesToRemove:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/Impulse_10\n  read/Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/Map(<lambda at iobase.py:899>)_11\n  read/Read/Map(<lambda at iobase.py:899>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)_13\n  read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_16\n  read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_18\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2957>)_19\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_21\n  read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)_22\n  read/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_row to string_23\n  row to string:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/CombinePerKey_27\n  count/CombineGlobally(CountCombineFn)/CombinePerKey:beam:transform:combine_per_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/UnKey_30\n  count/CombineGlobally(CountCombineFn)/UnKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/Impulse_32\n  count/CombineGlobally(CountCombineFn)/DoOnce/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:2957>)_33\n  count/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/Map(decode)_35\n  count/CombineGlobally(CountCombineFn)/DoOnce/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/InjectDefault_36\n  count/CombineGlobally(CountCombineFn)/InjectDefault:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_39\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_40\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_42\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_43\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_44\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_46\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_47\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_48\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_49\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_50\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_51\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_52\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/KeyWithVoid_26\n  count/CombineGlobally(CountCombineFn)/KeyWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function pack_combiners at 0x7f3da5f72840> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 32 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_read/FilesToRemoveImpulse/Impulse_4\n  read/FilesToRemoveImpulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/FilesToRemoveImpulse/FlatMap(<lambda at core.py:2957>)_5\n  read/FilesToRemoveImpulse/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/FilesToRemoveImpulse/Map(decode)_7\n  read/FilesToRemoveImpulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/MapFilesToRemove_8\n  read/MapFilesToRemove:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/Impulse_10\n  read/Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/Map(<lambda at iobase.py:899>)_11\n  read/Read/Map(<lambda at iobase.py:899>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)_13\n  read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_16\n  read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_18\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2957>)_19\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_21\n  read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)_22\n  read/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_row to string_23\n  row to string:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/UnKey_30\n  count/CombineGlobally(CountCombineFn)/UnKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/Impulse_32\n  count/CombineGlobally(CountCombineFn)/DoOnce/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:2957>)_33\n  count/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/Map(decode)_35\n  count/CombineGlobally(CountCombineFn)/DoOnce/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/InjectDefault_36\n  count/CombineGlobally(CountCombineFn)/InjectDefault:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_39\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_40\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_42\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_43\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_44\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_46\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_47\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_48\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_49\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_50\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_51\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_52\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/KeyWithVoid_26\n  count/CombineGlobally(CountCombineFn)/KeyWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/CombinePerKey_27\n  count/CombineGlobally(CountCombineFn)/CombinePerKey:beam:transform:combine_per_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function sort_stages at 0x7f3da5f74048> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 32 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_read/FilesToRemoveImpulse/Impulse_4\n  read/FilesToRemoveImpulse/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/FilesToRemoveImpulse/FlatMap(<lambda at core.py:2957>)_5\n  read/FilesToRemoveImpulse/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/FilesToRemoveImpulse/Map(decode)_7\n  read/FilesToRemoveImpulse/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/MapFilesToRemove_8\n  read/MapFilesToRemove:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/Impulse_10\n  read/Read/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/Map(<lambda at iobase.py:899>)_11\n  read/Read/Map(<lambda at iobase.py:899>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)_13\n  read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_16\n  read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_18\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2957>)_19\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_21\n  read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)_22\n  read/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_row to string_23\n  row to string:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/KeyWithVoid_26\n  count/CombineGlobally(CountCombineFn)/KeyWithVoid:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/CombinePerKey_27\n  count/CombineGlobally(CountCombineFn)/CombinePerKey:beam:transform:combine_per_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/UnKey_30\n  count/CombineGlobally(CountCombineFn)/UnKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/Impulse_32\n  count/CombineGlobally(CountCombineFn)/DoOnce/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:2957>)_33\n  count/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/DoOnce/Map(decode)_35\n  count/CombineGlobally(CountCombineFn)/DoOnce/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_count/CombineGlobally(CountCombineFn)/InjectDefault_36\n  count/CombineGlobally(CountCombineFn)/InjectDefault:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_39\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_40\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_42\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_43\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_44\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_46\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_47\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_48\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_49\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_50\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_51\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_52\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/requirements.txt in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/parameterized-0.7.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/pbr-5.5.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/pbr-5.5.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/parameterized-0.7.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/parameterized-0.7.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/six-1.15.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/six-1.15.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0104182425-739198.1609784665.739516/dataflow-worker.jar...
--------------------- >> end captured logging << ---------------------
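For anyone triaging this failure: the HTTP 410 ("Gone") above was returned while RefreshResumableUploadState probed a GCS resumable upload of dataflow-worker.jar. A 410 on a resumable session generally means the session itself has been invalidated server-side, so re-polling it cannot succeed; the usual recovery is to open a fresh session and re-send from the start. A minimal sketch of that restart-on-410 pattern, with begin_resumable_session and upload_chunks as hypothetical stand-ins rather than the apitools API used in the traceback:

    import time

    class SessionGone(Exception):
        """Stand-in for an HTTP 410 response on a resumable upload session."""

    def begin_resumable_session(dest):
        # Hypothetical helper: a real client would POST with
        # uploadType=resumable and keep the returned session URL.
        return {"dest": dest}

    def upload_chunks(session, data):
        # Hypothetical helper: a real client would PUT chunks against the
        # session URL, checking the committed offset between chunks.
        return len(data)

    def upload_with_restart(data, dest, attempts=3):
        # A 410 means the session is gone: open a new session and re-send
        # from the beginning, with backoff between attempts, instead of
        # re-polling the dead session.
        for attempt in range(attempts):
            session = begin_resumable_session(dest)
            try:
                return upload_chunks(session, data)
            except SessionGone:
                time.sleep(2 ** attempt)
        raise RuntimeError("upload failed after %d attempts" % attempts)

Whether the retry wrapper visible in retry.py retries this particular status depends on its filter; the sketch only illustrates the session-restart idea, not Beam's actual behavior.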

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4255.004s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 46s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/wlmgqbefbf4aa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #3376

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3376/display/redirect>

