Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/16 13:06:41 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #473

See <https://builds.apache.org/job/beam_PostCommit_Python37/473/display/redirect>

------------------------------------------
[...truncated 79.62 KB...]
> Task :sdks:python:test-suites:direct:py37:hdfsIntegrationTest
[15112] Failed to execute script docker-compose
Traceback (most recent call last):
  File "bin/docker-compose", line 6, in <module>
  File "compose/cli/main.py", line 71, in main
  File "compose/cli/main.py", line 127, in perform_command
  File "compose/cli/main.py", line 287, in build
  File "compose/project.py", line 386, in build
  File "compose/project.py", line 368, in build_service
  File "compose/service.py", line 1084, in build
  File "site-packages/docker/api/build.py", line 260, in build
  File "site-packages/docker/api/build.py", line 307, in _set_auth_headers
  File "site-packages/docker/auth.py", line 310, in get_all_credentials
  File "site-packages/docker/auth.py", line 262, in _resolve_authconfig_credstore
  File "site-packages/docker/auth.py", line 287, in _get_store_instance
  File "site-packages/dockerpycreds/store.py", line 25, in __init__
dockerpycreds.errors.InitializationError: docker-credential-gcloud not installed or not available in PATH
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-473 --no-ansi down
Removing network hdfs_it-jenkins-beam_postcommit_python37-473_test_net
Network hdfs_it-jenkins-beam_postcommit_python37-473_test_net not found.

real	0m0.730s
user	0m0.619s
sys	0m0.103s

> Task :sdks:python:test-suites:direct:py37:hdfsIntegrationTest FAILED
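
For reference, a minimal Python sketch of the credential-helper lookup that fails above (the config path and the "gcloud" value are assumptions about the Jenkins agent, not taken from this log). Docker clients read the credsStore key from ~/.docker/config.json and exec a docker-credential-<name> binary, so a missing docker-credential-gcloud on PATH aborts docker-compose before any image is built:

    import json
    import os
    import shutil

    def resolve_credential_helper(config_path=os.path.expanduser("~/.docker/config.json")):
        # dockerpycreds reads credsStore from the Docker client config.
        with open(config_path) as f:
            store = json.load(f).get("credsStore")  # e.g. "gcloud" (assumed)
        if store is None:
            return None  # no helper configured; plain auth entries are used
        helper = "docker-credential-" + store
        if shutil.which(helper) is None:
            # mirrors dockerpycreds.errors.InitializationError in the traceback
            raise RuntimeError(helper + " not installed or not available in PATH")
        return helper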

> Task :sdks:python:test-suites:dataflow:py37:installGcpTest
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.0 docopt-0.6.2 fastavro-0.21.24 fasteners-0.15 future-0.17.1 google-api-core-1.14.2 google-apitools-0.5.28 google-auth-1.6.3 google-cloud-bigquery-1.17.0 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.0 google-resumable-media-0.4.0 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.2 oauth2client-3.0.0 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.14.1 pyasn1-0.4.7 pyasn1-modules-0.2.6 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.2 python-dateutil-2.8.0 pytz-2019.2 pyyaml-3.13 requests-2.22.0 rsa-4.0 tenacity-5.1.1 urllib3-1.25.3

> Task :sdks:python:test-suites:direct:py37:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
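
As a hedged sketch of how these flags reach an individual test: under nose, Beam's TestPipeline parses --test-pipeline-options from the command line and exposes each flag to the test body (the option name below is an example, not taken from this run):

    from apache_beam.testing.test_pipeline import TestPipeline

    # TestPipeline picks up --test-pipeline-options from sys.argv when the
    # suite runs under nose, making flags like --project queryable per test.
    test_pipeline = TestPipeline(is_integration_test=True)
    project = test_pipeline.get_option("project")  # e.g. apache-beam-testing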
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
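
The UserWarning above is routine PEP 440 normalization rather than a failure; a one-liner with the packaging library (assumed installed) shows the same rewrite:

    from packaging.version import Version

    # PEP 440 requires a numeral on dev segments, hence '.dev' -> '.dev0'.
    print(Version("2.17.0.dev"))  # prints: 2.17.0.dev0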
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
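
These BeamDeprecationWarnings all flag reads of <pipeline>.options; a minimal sketch of the non-deprecated pattern (the flag value is a placeholder) keeps the constructed PipelineOptions in scope instead of reading it back off the pipeline:

    from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                      PipelineOptions)

    # Hold on to the options object you built instead of using p.options.
    options = PipelineOptions(["--temp_location", "gs://my-bucket/tmp"])
    temp_location = options.view_as(GoogleCloudOptions).temp_location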
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 25.031s

OK (SKIP=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.17.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_47-17095848976259733211?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_21_31-6320719833525934504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_31_34-9197350960815037655?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_41_31-4913288843517911592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_50_38-16766802343582891243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_43-5449315457297933619?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_31_32-12470804946044629851?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_42_09-8537255866499469063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_52_23-17907075844252586735?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_46-11781474408674389616?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_19_15-6807237494254705880?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_29_20-6376578468860874931?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_48_37-18248499595035480166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_43-6626151349690162437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_28_42-4987220983262403924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_40_07-8414681231615055380?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_43-986594692075140276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_16_05-10734845338133236046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_25_24-10303487754434050457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_35_38-18190718120404447198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_44_45-1516721865333900545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_42-9098464841493792095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_16_10-4728200378261784585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_25_48-879332670382883832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_35_31-11408095374416024490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_45_04-5275781171398743065?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
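
The BigQuerySink warnings above name their replacement directly; a minimal WriteToBigQuery sketch (the destination table and schema are placeholders, not from this build):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{"word": "beam", "count": 1}])
         | beam.io.WriteToBigQuery(
             "my-project:my_dataset.my_table",  # placeholder destination
             schema="word:STRING,count:INTEGER",
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))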
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_43-13451533571866653631?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_16_25-6307923663386401425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_28_32-8478779528567591095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_37_57-11692396294900031686?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_48_33-118107226477510376?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
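
For the FutureWarnings above, a short sketch of the experimental transforms being exercised (the glob is a placeholder): MatchAll expands file patterns into match metadata, and ReadMatches opens each match for reading.

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | beam.Create(["gs://my-bucket/path/*.txt"])  # placeholder pattern
         | fileio.MatchAll()      # experimental, per the warning
         | fileio.ReadMatches()   # experimental, per the warning
         | beam.Map(lambda readable: readable.read_utf8()))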
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_05_49-16584678055340682634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_16_06-15549853221548425567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_26_48-8311364187106187786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_36_55-9234526890361171011?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_47_22-3197533611130103746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_05_56_35-12515399198487474280?project=apache-beam-testing
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3668.417s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>' line: 61

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 6s
65 actionable tasks: 48 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/uzjnf3v5qosty

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #474

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python37/474/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-8178] Switch to a new gradle task that builds python docker image

------------------------------------------
[...truncated 622.74 KB...]
root: INFO: 2019-09-16T14:22:27.165Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-09-16T14:22:27.191Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-09-16T14:22:27.221Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-09-16T14:22:27.248Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-09-16T14:22:27.284Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests/BigQueryBatchFileLoads/RewindowIntoGlobal
root: INFO: 2019-09-16T14:22:27.319Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-09-16T14:22:27.356Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-09-16T14:22:27.392Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-09-16T14:22:27.424Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-09-16T14:22:27.451Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-09-16T14:22:27.488Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-09-16T14:22:27.526Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-09-16T14:22:27.559Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-09-16T14:22:27.589Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-09-16T14:22:27.623Z: JOB_MESSAGE_DETAILED: Fusing siblings WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs and WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-09-16T14:22:27.659Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>) into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-09-16T14:22:27.694Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
root: INFO: 2019-09-16T14:22:27.720Z: JOB_MESSAGE_DETAILED: Fusing siblings WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs and WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-09-16T14:22:27.747Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-09-16T14:22:27.782Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
root: INFO: 2019-09-16T14:22:27.821Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-09-16T14:22:27.858Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-09-16T14:22:27.885Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-09-16T14:22:27.916Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-09-16T14:22:27.953Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-09-16T14:22:27.990Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-09-16T14:22:28.013Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-09-16T14:22:28.041Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-09-16T14:22:28.069Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-09-16T14:22:28.101Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-09-16T14:22:28.133Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-09-16T14:22:28.169Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-09-16T14:22:28.197Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-09-16T14:22:28.231Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-09-16T14:22:28.275Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-09-16T14:22:28.304Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-09-16T14:22:28.337Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-09-16T14:22:28.374Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-09-16T14:22:28.569Z: JOB_MESSAGE_DEBUG: Executing wait step start99
root: INFO: 2019-09-16T14:22:28.644Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-09-16T14:22:28.671Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)
root: INFO: 2019-09-16T14:22:28.683Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-09-16T14:22:28.700Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-09-16T14:22:28.711Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-09-16T14:22:28.729Z: JOB_MESSAGE_BASIC: Executing operation MakeSchemas/Read
root: INFO: 2019-09-16T14:22:28.761Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-09-16T14:22:28.766Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-09-16T14:22:28.766Z: JOB_MESSAGE_BASIC: Finished operation MakeSchemas/Read
root: INFO: 2019-09-16T14:22:28.791Z: JOB_MESSAGE_BASIC: Executing operation MakeTables/Read
root: INFO: 2019-09-16T14:22:28.822Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-09-16T14:22:28.826Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-09-16T14:22:28.829Z: JOB_MESSAGE_BASIC: Finished operation MakeTables/Read
root: INFO: 2019-09-16T14:22:28.848Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-09-16T14:22:28.874Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-09-16T14:22:28.877Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-09-16T14:22:28.892Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
root: INFO: 2019-09-16T14:22:28.907Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-09-16T14:22:28.927Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-09-16T14:22:28.937Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-09-16T14:22:28.958Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-09-16T14:22:28.967Z: JOB_MESSAGE_DEBUG: Value "MakeSchemas/Read.out" materialized.
root: INFO: 2019-09-16T14:22:28.992Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-09-16T14:22:29.002Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-09-16T14:22:29.029Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-09-16T14:22:29.052Z: JOB_MESSAGE_DEBUG: Value "MakeTables/Read.out" materialized.
root: INFO: 2019-09-16T14:22:29.076Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-09-16T14:22:29.106Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-09-16T14:22:29.139Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-09-16T14:22:29.177Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-09-16T14:22:29.214Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.243Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.250Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.275Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.290Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.326Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.330Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.361Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-09-16T14:22:29.365Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.396Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.412Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:587>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-09-16T14:22:29.424Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-09-16T14:22:29.444Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-09-16T14:22:29.470Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-09-16T14:22:29.506Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-09-16T14:22:29.538Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-09-16T14:22:29.574Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-09-16T14:22:55.013Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-16T14:24:34.621Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-16T14:24:34.656Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-09-16T14:28:25.572Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)
root: INFO: 2019-09-16T14:28:25.688Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
root: INFO: 2019-09-16T14:28:25.721Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
root: INFO: 2019-09-16T14:28:25.759Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>).out" materialized.
root: INFO: 2019-09-16T14:28:25.793Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:25.829Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:25.837Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:25.858Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:25.892Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:25.894Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:25.925Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:25.933Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:25.956Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:25.972Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-09-16T14:28:25.991Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:25.999Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-09-16T14:28:26.044Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0).output" materialized.
root: INFO: 2019-09-16T14:28:26.079Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0).output" materialized.
root: INFO: 2019-09-16T14:28:26.115Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0).output" materialized.
root: INFO: 2019-09-16T14:28:26.198Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
root: INFO: 2019-09-16T14:28:28.555Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-09-16T14:28:29.705Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>)+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-09-16T14:28:29.791Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
root: INFO: 2019-09-16T14:28:29.821Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:800>).out" materialized.
root: INFO: 2019-09-16T14:28:29.839Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
root: INFO: 2019-09-16T14:28:29.888Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:29.932Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:29.954Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:29.962Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:29.983Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:29.983Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:30.007Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0)
root: INFO: 2019-09-16T14:28:30.012Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:30.039Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:30.042Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0).output" materialized.
root: INFO: 2019-09-16T14:28:30.052Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
root: INFO: 2019-09-16T14:28:30.069Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0).output" materialized.
root: INFO: 2019-09-16T14:28:30.096Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:800>).out.0).output" materialized.
root: INFO: 2019-09-16T14:28:30.123Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-09-16T14:28:30.146Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
root: INFO: 2019-09-16T14:28:30.167Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
root: INFO: 2019-09-16T14:28:39.495Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
root: INFO: 2019-09-16T14:28:39.545Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
root: INFO: 2019-09-16T14:28:39.598Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
root: INFO: 2019-09-16T14:28:39.658Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
root: INFO: 2019-09-16T14:28:39.710Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
root: INFO: 2019-09-16T14:28:39.908Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-09-16T14:28:39.977Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:587>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-09-16T14:28:40.423Z: JOB_MESSAGE_WARNING: S27:WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables failed.
root: INFO: 2019-09-16T14:28:40.577Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
root: INFO: 2019-09-16T14:28:40.826Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-16T14:28:41.342Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-09-16T14:28:41.362Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-16T14:32:01.999Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-09-16T14:32:02.045Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-16T14:32:02.089Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-09-16_07_22_20-16336826610223018234 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_file_loads_15686437286158 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
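
The JOB_STATE_FAILED transition above is what the test harness observes when it blocks on the job; roughly, and as a sketch rather than the harness's actual code:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions())  # runner flags are placeholders
    _ = p | beam.Create([1, 2, 3])
    result = p.run()
    state = result.wait_until_finish()  # 'FAILED' surfaces as JOB_STATE_FAILED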
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_48-8749339800365309374?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_32_46-2280815557985039831?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_51-2736796879491962806?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
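The two BeamDeprecationWarnings above flag reads of <pipeline>.options; keeping a reference to the PipelineOptions object used to construct the pipeline avoids the deprecated attribute. A minimal sketch, with a hypothetical experiment flag:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    options = PipelineOptions(['--experiments=use_beam_bq_sink'])  # hypothetical flag
    p = beam.Pipeline(options=options)
    # Read settings from the options object itself, not from p.options.
    experiments = options.view_as(DebugOptions).experiments or []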
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_26_39-7142914979578726127?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_38_04-6255323627300976013?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_47_57-3468746998384741681?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
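The FutureWarnings above come from the experimental fileio.MatchAll and fileio.ReadMatches transforms used by these tests; the pattern looks roughly like the following sketch (the glob, bucket, and compute_hash helper are hypothetical, not the test's actual code):

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Hash the contents of one matched file.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        (p
         | 'Patterns' >> beam.Create(['gs://my-bucket/dir/*'])  # hypothetical glob
         | 'MatchAll' >> fileio.MatchAll()
         | 'ReadMatches' >> fileio.ReadMatches()
         | 'Checksums' >> beam.Map(compute_hash))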
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_52-14980278764309259032?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_26_10-18432511344977094215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_36_38-17017584777775753669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_46_13-16853860671107211806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_47-7466918130260941671?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_34_21-13006502374736719521?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_42_49-11079766573101423803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_52_59-11053913085481918187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_47-7852988544086919387?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_21_50-9877978110169647340?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_31_44-12691783233234054270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_41_45-5431271968050012499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_47-7779359445061860302?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_22_10-6766516818613557576?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_32_14-15371083153055766003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_41_58-13475302097652406940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_51_56-15856662222938959965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_54-14719566280860400141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_22_03-1101871977675344906?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_26_38-11501851469865097263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_36_51-6717586576695555891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_46_55-11502447029591102617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_10_48-15944020605233098078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_22_20-16336826610223018234?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_32_24-17977440491459776624?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_42_39-3689428602513419400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_07_51_46-7669288463863035787?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3120.362s

FAILED (SKIP=6, errors=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 53s
65 actionable tasks: 48 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/5jmcwchdekvfi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org