Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/10/01 00:59:18 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #604

See <https://builds.apache.org/job/beam_PostCommit_Python36/604/display/redirect>

Changes:


------------------------------------------
[...truncated 84.81 KB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py",> line 149, in test_big_query_legacy_sql
    big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py",> line 82, in run_bq_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py",> line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py",> line 51, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Test pipeline expected terminated in state: DONE and Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72)
     but: Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 Actual checksum is da39a3ee5e6b4b0d3255bfef95601890afd80709
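
Note on the assertion: da39a3ee5e6b4b0d3255bfef95601890afd80709 is the SHA-1 of the empty byte string, so the matcher hashed zero rows. The validation query returned nothing even though two rows had just been flushed (see "total rows 0" in the captured logging below), most likely because the freshly streamed rows were not yet visible to the query. A minimal sketch of this style of row-checksum comparison, assuming the matcher SHA-1s a sorted, newline-joined rendering of the rows; compute_checksum is a hypothetical helper, not the Beam API:

    import hashlib

    def compute_checksum(rows):
        # Hash a canonical, order-independent rendering of the rows so
        # the digest depends only on their contents.
        content = '\n'.join(sorted(str(row) for row in rows))
        return hashlib.sha1(content.encode('utf-8')).hexdigest()

    # SHA-1 of empty input: exactly what the verifier computed for 0 rows.
    assert compute_checksum([]) == 'da39a3ee5e6b4b0d3255bfef95601890afd80709'

Reproducing the expected digest 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 would require the matcher's exact canonicalization of the two expected rows, which this sketch does not claim to match.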

-------------------- >> begin captured logging << --------------------
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: INFO: Running pipeline with DirectRunner.
root: DEBUG: Query SELECT * FROM (SELECT "apple" as fruit), (SELECT "orange" as fruit), does not reference any tables.
root: WARNING: Dataset apache-beam-testing:temp_dataset_c5a4acf6352540999333141989e01d1a does not exist so we will create it as temporary with location=None
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_query_to_table_15698881157154'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'> with schema {'fields': [{'name': 'fruit', 'type': 'STRING', 'mode': 'NULLABLE'}]}.
root: DEBUG: Created the table with id output_table
root: INFO: Created table apache-beam-testing.python_query_to_table_15698881157154.output_table with schema <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>. Result: <Table
 creationTime: 1569888120471
 etag: '2SbUTP4Xxr1mXEWQm/E1AQ=='
 id: 'apache-beam-testing:python_query_to_table_15698881157154.output_table'
 kind: 'bigquery#table'
 lastModifiedTime: 1569888120528
 location: 'US'
 numBytes: 0
 numLongTermBytes: 0
 numRows: 0
 schema: <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>
 selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15698881157154/tables/output_table'
 tableReference: <TableReference
 datasetId: 'python_query_to_table_15698881157154'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'>
 type: 'TABLE'>.
root: DEBUG: Attempting to flush to all destinations. Total buffered: 2
root: DEBUG: Flushing data to apache-beam-testing:python_query_to_table_15698881157154.output_table. Total 2 rows.
root: DEBUG: Passed: True. Errors are []
root: INFO: Attempting to perform query SELECT fruit from `python_query_to_table_15698881157154.output_table`; to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/c848fc5a-b797-4e7d-98ff-583778f378a0?maxResults=0&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anone1bc664367144cb1e075dcfa92f94b973ac4ff4f/data HTTP/1.1" 200 None
root: INFO: Read from given query (SELECT fruit from `python_query_to_table_15698881157154.output_table`;), total rows 0
root: INFO: Generate checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 29.700s
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
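
For reference, the BeamDeprecationWarnings above all flag reads of options through the constructed pipeline object (p.options / pcoll.pipeline.options). A minimal sketch of the supported pattern, keeping a handle on the PipelineOptions used to build the pipeline and reading views from it directly (the temp_location value is a placeholder):

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])

    # Read option views from the PipelineOptions object itself rather
    # than from <pipeline>.options, which the warning says will go away.
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with Pipeline(options=options) as p:
        pass  # construct transforms against p here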


FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py36:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_14-16359215888678100710?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_16_54-13997730143534146658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_25_08-17533529402016909260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_36_01-1635463575327295550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_44_53-7387273507595242583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_15-15975706468136730383?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_23_35-10396180277934185824?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_41_54-17404092798620021114?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
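
The fix the warning itself suggests for the deprecated BigQuerySink is the WriteToBigQuery transform. A hedged sketch using the fruit schema from the failing test earlier in this log (the dataset name and pipeline wiring are placeholders; kms_key is accepted the same way if needed):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
         | beam.io.WriteToBigQuery(
             table='output_table',
             dataset='my_dataset',      # placeholder dataset
             project='apache-beam-testing',
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE))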
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_14-15554256948144778489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_15_04-14437764426093667500?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_23_51-512657190599437694?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_36_06-10246936613798024985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_44_47-15239513807669147236?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_09-17813610893139213434?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_23_05-10881050842293936907?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_34_41-840340341538378335?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_09-10779071306352846664?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_11_26-5230398196621972591?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_20_35-14739580767926918297?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_29_44-8668162551507393548?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_40_28-1835712471549131160?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_09-15067236847547505793?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_10_39-9955903867363195253?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_19_22-12867398725010498554?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_28_28-12557138248633281161?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_39_35-3784611839266459240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_15-7551547838623334886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_11_11-1769754580659101244?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_18_40-18210215537690003075?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_27_05-15984762949987679318?project=apache-beam-testing
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_37_30-7716946955079508803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_02_12-12486334998358544011?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_11_01-2824463109677661984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_19_09-6995064845872155166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_27_48-2139258262588237675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_37_35-8628220660025764891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_17_48_43-16770718613704464500?project=apache-beam-testing
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3439.640s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 20s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/7cqv5ocyml24m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #607

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/607/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python36 #606

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/606/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-8021] Swap build-tools to be compile only so it isn't a "required"


------------------------------------------
[...truncated 466.73 KB...]
root: INFO: 2019-10-01T09:52:20.254Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner.
root: INFO: 2019-10-01T09:52:20.282Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-10-01T09:52:20.307Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-10-01T09:52:20.337Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-10-01T09:52:20.371Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-10-01T09:52:20.406Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-10-01T09:52:20.451Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-10-01T09:52:20.491Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-10-01T09:52:20.766Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-10-01T09:52:20.835Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-10-01T09:52:20.878Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
root: INFO: 2019-10-01T09:52:20.906Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
root: INFO: 2019-10-01T09:52:20.946Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/FlattenPartitions into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
root: INFO: 2019-10-01T09:52:20.979Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/FlattenPartitions
root: INFO: 2019-10-01T09:52:21.013Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
root: INFO: 2019-10-01T09:52:21.048Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions
root: INFO: 2019-10-01T09:52:21.086Z: JOB_MESSAGE_DETAILED: Unzipping flatten s21 for input s15.out_WrittenFiles
root: INFO: 2019-10-01T09:52:21.122Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-10-01T09:52:21.156Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-10-01T09:52:21.190Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-10-01T09:52:21.225Z: JOB_MESSAGE_DETAILED: Unzipping flatten s59 for input s53.out_WrittenFiles
root: INFO: 2019-10-01T09:52:21.263Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-10-01T09:52:21.298Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-10-01T09:52:21.335Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-10-01T09:52:21.370Z: JOB_MESSAGE_DETAILED: Unzipping flatten s21-u71 for input s22-reify-value9-c69
root: INFO: 2019-10-01T09:52:21.404Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-10-01T09:52:21.441Z: JOB_MESSAGE_DETAILED: Unzipping flatten s59-u76 for input s60-reify-value45-c74
root: INFO: 2019-10-01T09:52:21.477Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-10-01T09:52:21.511Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RewindowIntoGlobal into FlatMap(<lambda at bigquery_file_loads_test.py:589>)
root: INFO: 2019-10-01T09:52:21.546Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RewindowIntoGlobal into FlatMap(<lambda at bigquery_file_loads_test.py:589>)
root: INFO: 2019-10-01T09:52:21.580Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-10-01T09:52:21.613Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-10-01T09:52:21.647Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-10-01T09:52:21.682Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-10-01T09:52:21.715Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at bigquery_file_loads_test.py:587>) into Create/Read
root: INFO: 2019-10-01T09:52:21.749Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into Map(<lambda at bigquery_file_loads_test.py:587>)
root: INFO: 2019-10-01T09:52:21.783Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-10-01T09:52:21.818Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-10-01T09:52:21.849Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at bigquery_file_loads_test.py:589>) into GroupByKey/GroupByWindow
root: INFO: 2019-10-01T09:52:21.878Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RewindowIntoGlobal
root: INFO: 2019-10-01T09:52:21.908Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination
root: INFO: 2019-10-01T09:52:21.943Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-10-01T09:52:21.972Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-10-01T09:52:22.010Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-10-01T09:52:22.044Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-10-01T09:52:22.079Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-10-01T09:52:22.115Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-10-01T09:52:22.152Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests/BigQueryBatchFileLoads/RewindowIntoGlobal
root: INFO: 2019-10-01T09:52:22.179Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-10-01T09:52:22.208Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-10-01T09:52:22.234Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-10-01T09:52:22.259Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-10-01T09:52:22.296Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-10-01T09:52:22.331Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-10-01T09:52:22.365Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-10-01T09:52:22.399Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-10-01T09:52:22.434Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-10-01T09:52:22.465Z: JOB_MESSAGE_DETAILED: Fusing siblings WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs and WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-10-01T09:52:22.504Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>) into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-10-01T09:52:22.534Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-10-01T09:52:22.554Z: JOB_MESSAGE_DETAILED: Fusing siblings WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs and WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-10-01T09:52:22.577Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-10-01T09:52:22.603Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-10-01T09:52:22.640Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-10-01T09:52:22.676Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-10-01T09:52:22.710Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-10-01T09:52:22.747Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-10-01T09:52:22.782Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-10-01T09:52:22.817Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-10-01T09:52:22.855Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-10-01T09:52:22.889Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-10-01T09:52:22.915Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-10-01T09:52:22.942Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-10-01T09:52:22.978Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-10-01T09:52:23.015Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-10-01T09:52:23.042Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-10-01T09:52:23.077Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-10-01T09:52:23.114Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-10-01T09:52:23.151Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-10-01T09:52:23.192Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-10-01T09:52:23.231Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-10-01T09:52:23.488Z: JOB_MESSAGE_DEBUG: Executing wait step start99
root: INFO: 2019-10-01T09:52:23.549Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-10-01T09:52:23.586Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)
root: INFO: 2019-10-01T09:52:23.599Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-10-01T09:52:23.610Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-10-01T09:52:23.632Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-10-01T09:52:23.645Z: JOB_MESSAGE_BASIC: Executing operation MakeSchemas/Read
root: INFO: 2019-10-01T09:52:23.671Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-10-01T09:52:23.684Z: JOB_MESSAGE_BASIC: Finished operation MakeSchemas/Read
root: INFO: 2019-10-01T09:52:23.692Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-10-01T09:52:23.705Z: JOB_MESSAGE_BASIC: Executing operation MakeTables/Read
root: INFO: 2019-10-01T09:52:23.723Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-10-01T09:52:23.742Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-10-01T09:52:23.745Z: JOB_MESSAGE_BASIC: Finished operation MakeTables/Read
root: INFO: 2019-10-01T09:52:23.769Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-10-01T09:52:23.787Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-10-01T09:52:23.799Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-10-01T09:52:23.817Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
root: INFO: 2019-10-01T09:52:23.830Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-10-01T09:52:23.849Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-10-01T09:52:23.849Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-10-01T09:52:23.873Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-10-01T09:52:23.881Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-10-01T09:52:23.905Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-10-01T09:52:23.906Z: JOB_MESSAGE_DEBUG: Value "MakeSchemas/Read.out" materialized.
root: INFO: 2019-10-01T09:52:23.944Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-10-01T09:52:23.979Z: JOB_MESSAGE_DEBUG: Value "MakeTables/Read.out" materialized.
root: INFO: 2019-10-01T09:52:24.009Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-10-01T09:52:24.045Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-10-01T09:52:24.081Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-10-01T09:52:24.115Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-10-01T09:52:24.154Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-10-01T09:52:24.179Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.211Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.214Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.236Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.250Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.265Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.267Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.295Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.297Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.332Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:587>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-10-01T09:52:24.335Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-10-01T09:52:24.360Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-10-01T09:52:24.397Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-10-01T09:52:24.422Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-10-01T09:52:24.453Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-10-01T09:52:24.479Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-10-01T09:53:15.950Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 1250.0 in region us-central1.
root: INFO: 2019-10-01T09:53:15.988Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-10-01T09:53:16.071Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:587>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-10-01T09:53:16.322Z: JOB_MESSAGE_WARNING: S01:WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix failed.
root: INFO: 2019-10-01T09:53:16.327Z: JOB_MESSAGE_WARNING: S08:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>) failed.
root: INFO: 2019-10-01T09:53:16.358Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-10-01T09:53:16.367Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)
root: INFO: 2019-10-01T09:53:16.487Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-10-01T09:53:16.546Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-10-01T09:53:16.572Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-10-01T09:53:35.177Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-10-01T09:53:35.213Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-10-01_02_52_14-14203309271753942152 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_file_loads_15699235217501 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
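
The QUOTA_EXCEEDED failure in the log above is a capacity problem rather than a code problem: the shared apache-beam-testing project was already at its 1250-CPU limit in us-central1, so even the single requested worker could not start. A minimal sketch of standard Dataflow pipeline options that cap a job's CPU footprint while a region is near quota (the temp bucket is a placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Cap the worker pool so the job requests as few regional CPUs as
    # possible; a smaller machine type reduces the footprint further.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--num_workers=1',
        '--max_num_workers=1',
        '--machine_type=n1-standard-1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder bucket
    ])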
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_53-11222076425446829631?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_56_23-8519457308441887426?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_07_02-16772285793346545229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_16_25-11363090169980466213?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_26_52-16202505105353109990?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_48-2184084193681017605?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_01_28-16028421228330362048?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_11_09-5520265713209520395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_20_49-12129637668285769944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_52-3753165556453982490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_54_40-1439584062197347175?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_57_03-16553205891500110430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_07_01-12976759673814634698?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_17_00-6603887878380756378?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
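The FutureWarnings above come from the experimental fileio transforms exercised in fileio_test.py. A self-contained sketch of the same shape, assuming local text files match the pattern (compute_hash is a stand-in for the test's helper):

    import hashlib
    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Stand-in for the test helper: hash each matched file's contents.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/data/*.txt'])   # file patterns (illustrative)
             | fileio.MatchAll()                  # experimental: pattern -> FileMetadata
             | fileio.ReadMatches()               # experimental: FileMetadata -> ReadableFile
             | 'Checksums' >> beam.Map(compute_hash))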
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_48-15790443439536183519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_01_00-16114806303003462173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_19_53-4740449438492757017?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_48-193120021006866896?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_52_14-14203309271753942152?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_53_58-1305030054767435587?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_05_50-12854537977644582613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_15_54-15094852408348666479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_47-15303116306288240326?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_49_37-17103354711379993018?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_59_43-1437142295479528751?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_10_35-13203584903223549028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_57-16268891273282746696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_50_26-15635289984437197497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_00_20-3306151632249871328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_09_49-388956491406346082?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_19_00-2114340832024831453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_29_14-9439693287997090102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_39_50-9605907524632422877?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_50_39-14992259536358365248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_02_59_09-1997271787356439419?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
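The deprecation message above names the replacement directly. A minimal sketch of WriteToBigQuery standing in for the deprecated BigQuerySink (the table and schema here are illustrative, not taken from this job):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
             | beam.io.WriteToBigQuery(
                 'apache-beam-testing:example_dataset.output_table',  # illustrative
                 schema='fruit:STRING',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE))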
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_09_53-12145101152749437139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-01_03_20_32-7618358677721959059?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3583.943s

FAILED (SKIP=6, errors=2)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 42s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/mqob4b7pvobjw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #605

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/605/display/redirect>

Changes:


------------------------------------------
[...truncated 162.43 KB...]
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputfef7880b-7e9d-4cb0-9635-6b2e337d2709"
          },
          {
            "key": "with_attributes",
            "label": "With Attributes",
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource",
            "type": "BOOLEAN",
            "value": false
          }
        ],
        "format": "pubsub",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "ReadFromPubSub/Read.out"
          }
        ],
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputfef7880b-7e9d-4cb0-9635-6b2e337d2709",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "StreamingUserMetricsDoFn",
            "type": "STRING",
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "generate_metrics.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4",
        "user_name": "generate_metrics"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputfef7880b-7e9d-4cb0-9635-6b2e337d2709",
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
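The job graph above (steps s1 through s3) describes a three-stage streaming pipeline: a Pub/Sub read, a metric-generating ParDo, and a Pub/Sub write. A hedged Python reconstruction of the same shape, with the DoFn body reduced to a pass-through placeholder:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        def process(self, element):
            yield element  # placeholder; the real DoFn updates user metrics

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # JOB_TYPE_STREAMING

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/apache-beam-testing/subscriptions/'
                 'exercise_streaming_metrics_subscription_inputfef7880b-7e9d-4cb0-9635-6b2e337d2709')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/apache-beam-testing/topics/'
                 'exercise_streaming_metrics_topic_outputfef7880b-7e9d-4cb0-9635-6b2e337d2709'))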
root: INFO: Create job: <Job
 createTime: '2019-10-01T06:36:53.982886Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-09-30_23_36_52-16416105006813009089'
 location: 'us-central1'
 name: 'beamapp-jenkins-1001063643-966034'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-10-01T06:36:53.982886Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-09-30_23_36_52-16416105006813009089]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_36_52-16416105006813009089?project=apache-beam-testing
root: INFO: Job 2019-09-30_23_36_52-16416105006813009089 is in state JOB_STATE_RUNNING
root: INFO: 2019-10-01T06:36:56.564Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-01T06:36:57.319Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-10-01T06:36:58.868Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-10-01T06:36:58.870Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-10-01T06:36:58.878Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-10-01T06:36:58.888Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-10-01T06:36:58.891Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-10-01T06:36:58.894Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-10-01T06:36:58.910Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-10-01T06:36:58.913Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
root: INFO: 2019-10-01T06:36:58.915Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
root: INFO: 2019-10-01T06:36:58.925Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-10-01T06:36:58.946Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-10-01T06:36:58.996Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-10-01T06:36:59.157Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-10-01T06:36:59.176Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-10-01T06:36:59.183Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-10-01T06:37:02.622Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+generate_metrics+dump_to_pub/Write/NativeWrite
root: WARNING: Timing out on waiting for job 2019-09-30_23_36_52-16416105006813009089 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
--------------------- >> end captured logging << ---------------------
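The google.auth and urllib3 lines at the end of the captured log show credentials being refreshed from the GCE metadata server. A hedged illustration of the same call made outside the SDK (requests assumed installed; this only works on a GCE or Dataflow VM):

    import requests

    # The metadata server only answers requests carrying this header.
    resp = requests.get(
        'http://metadata.google.internal/computeMetadata/v1/instance/'
        'service-accounts/default/token',
        headers={'Metadata-Flavor': 'Google'})
    access_token = resp.json()['access_token']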
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_36-2256935935032773335?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_19_53-16940546204886884688?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_30_12-1305157335559429021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_40_24-13521942317066394976?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_31-12293729860352431620?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_26_44-8357148510941151808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_35_42-14864620869702607122?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_45_37-1137412749098251840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_34-9824852779770665252?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_17_03-4742401790714650104?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_27_51-5133138165066673550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_36_52-13913286847248270416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_31-7722797472780387737?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_26_18-17249355167489421030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_30-2864011291764797640?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_13_03-4244960054398061733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_24_21-14429038169972970644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_34_40-9861639803441561361?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_30-17891739034831633794?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_13_43-13515266202209236998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_17_05-13782885384511570961?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_27_03-5830423512544445344?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_36_52-16416105006813009089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_35-15122841469746203690?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_14_13-15675573329390790323?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_23_52-13346269825534968754?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_33_39-4617614117480183241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_42_28-8294305823936045485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_03_32-4659189316018098353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_14_24-9036380935490887897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_25_52-5979823578620395721?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_35_46-14434946755486868600?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-30_23_46_08-5592384589980673262?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3143.454s

FAILED (SKIP=6, errors=1, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 18s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/dt4a434iwdjgq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org