Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/09/16 23:37:42 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #484

See <https://builds.apache.org/job/beam_PostCommit_Python2/484/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8034] Upgrade Flink Runner to 1.8.2

------------------------------------------
[...truncated 368.20 KB...]
Requirement already satisfied: mock<3.0.0,>=1.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (2.0.0)
Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (3.9.0)
Requirement already satisfied: oauth2client<4,>=2.0.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (3.9.1)
Requirement already satisfied: pydot<2,>=1.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (1.4.1)
Requirement already satisfied: python-dateutil<3,>=2.8.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (2.8.0)
Requirement already satisfied: pytz>=2018.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (2019.2)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (3.13)
Requirement already satisfied: avro<2.0.0,>=1.8.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (1.9.1)
Requirement already satisfied: funcsigs<2,>=1.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (1.0.2)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (3.3.0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (0.6.8)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (3.6.6)
Requirement already satisfied: pyarrow<0.15.0,>=0.11.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (0.14.1)
Requirement already satisfied: nose>=1.3.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (1.3.7)
Requirement already satisfied: nose_xunitmp>=0.4.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (0.4.1)
Requirement already satisfied: numpy<2,>=1.14.3 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (1.16.5)
Requirement already satisfied: pandas<0.25,>=0.23.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (0.24.2)
Requirement already satisfied: parameterized<0.7.0,>=0.6.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (0.6.3)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (1.9.0)
Requirement already satisfied: tenacity<6.0,>=5.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from apache-beam==2.17.0.dev0) (5.1.1)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.12.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.17.0.dev0) (1.1.6)
Requirement already satisfied: docopt in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0) (0.6.2)
Requirement already satisfied: requests>=2.7.0 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0) (2.22.0)
Requirement already satisfied: pbr>=0.11 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from mock<3.0.0,>=1.0.1->apache-beam==2.17.0.dev0) (5.4.3)
Requirement already satisfied: pyasn1>=0.1.7 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0) (0.4.7)
Requirement already satisfied: pyasn1-modules>=0.0.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0) (0.2.6)
Requirement already satisfied: rsa>=3.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from oauth2client<4,>=2.0.1->apache-beam==2.17.0.dev0) (4.0)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.17.0.dev0) (41.2.0)
Requirement already satisfied: pyparsing>=2.1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pydot<2,>=1.2.0->apache-beam==2.17.0.dev0) (2.4.2)
Requirement already satisfied: monotonic>=0.6; python_version == "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from tenacity<6.0,>=5.0.2->apache-beam==2.17.0.dev0) (1.5)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0) (1.25.3)
Requirement already satisfied: certifi>=2017.4.17 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0) (2019.9.11)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.17.0.dev0) (2.8)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.17.0.dev0
    Not uninstalling apache-beam at <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python>, outside environment <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:root:Writing 100000 documents to mongodb
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:68: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size)
INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f08e04e3d70> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f08e04e3e60> ====================
INFO:root:==================== <function lift_combiners at 0x7f08e04e3ed8> ====================
INFO:root:==================== <function expand_sdf at 0x7f08e04e3f50> ====================
INFO:root:==================== <function expand_gbk at 0x7f08e084c050> ====================
INFO:root:==================== <function sink_flattens at 0x7f08e084c140> ====================
INFO:root:==================== <function greedily_fuse at 0x7f08e084c1b8> ====================
INFO:root:==================== <function read_to_impulse at 0x7f08e084c230> ====================
INFO:root:==================== <function impulse_to_input at 0x7f08e084c2a8> ====================
INFO:root:==================== <function inject_timer_pcollections at 0x7f08e084c410> ====================
INFO:root:==================== <function sort_stages at 0x7f08e084c488> ====================
INFO:root:==================== <function window_pcollection_coders at 0x7f08e084c500> ====================
INFO:root:Running (ref_AppliedPTransform_Create documents/Read_3)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_5)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:root:Writing 100000 documents to mongodb finished in 52.489 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:root:Reading from mongodb beam_mongodbio_it_db:integration_test_1568672937
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:83: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f08e04e3d70> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f08e04e3e60> ====================
INFO:root:==================== <function lift_combiners at 0x7f08e04e3ed8> ====================
INFO:root:==================== <function expand_sdf at 0x7f08e04e3f50> ====================
INFO:root:==================== <function expand_gbk at 0x7f08e084c050> ====================
INFO:root:==================== <function sink_flattens at 0x7f08e084c140> ====================
INFO:root:==================== <function greedily_fuse at 0x7f08e084c1b8> ====================
INFO:root:==================== <function read_to_impulse at 0x7f08e084c230> ====================
INFO:root:==================== <function impulse_to_input at 0x7f08e084c2a8> ====================
INFO:root:==================== <function inject_timer_pcollections at 0x7f08e084c410> ====================
INFO:root:==================== <function sort_stages at 0x7f08e084c488> ====================
INFO:root:==================== <function window_pcollection_coders at 0x7f08e084c500> ====================
INFO:root:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:root:Read 100000 documents from mongodb finished in 21.634 seconds
mongoioit27344
mongoioit27344
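
The MongoDB IO integration test above round-trips the (still experimental, per the FutureWarnings) connector: 100000 documents are written with WriteToMongoDB in 52.489 seconds, then read back with ReadFromMongoDB and verified in 21.634 seconds. A minimal sketch of the same round trip, assuming a local mongod; the uri, collection name, element count, and batch size here are illustrative placeholders, not the job's actual settings (the IT derives its collection name and batch size from command-line args):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.io.mongodbio import ReadFromMongoDB, WriteToMongoDB

    # Supplying the runner explicitly avoids the "Missing pipeline option
    # (runner)" DirectRunner fallback reported in the INFO lines above.
    options = PipelineOptions(runner='DirectRunner')

    # Write: WriteToMongoDB assigns ObjectIds (the _GenerateObjectIdFn step
    # visible in the fused stages above) and batches the inserts.
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'Create documents' >> beam.Create([{'number': i} for i in range(1000)])
             | 'WriteToMongoDB' >> WriteToMongoDB(uri='mongodb://localhost:27017',
                                                  db='beam_mongodbio_it_db',
                                                  coll='integration_test',
                                                  batch_size=100))

    # Read back and project the field that the assert inspects.
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromMongoDB' >> ReadFromMongoDB(uri='mongodb://localhost:27017',
                                                    db='beam_mongodbio_it_db',
                                                    coll='integration_test')
             | 'Map' >> beam.Map(lambda doc: doc['number']))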

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4496.793s

OK (SKIP=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_22_59-16246445915241981677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_43_38-2678509682039486589?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_52_17-14767791286482311384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_00_59-2429604725258420670?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_11_02-1382629777651794199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_19_59-17119405961032377816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_28_07-17584025037365039240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_23_02-12534378490026743252?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_38_25-14288939044988211129?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_49_55-7769900112871824466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_59_25-8786875239263659847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_23_01-16295571579175172750?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_37_22-4510903125639104557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_45_27-11862585428555291081?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_53_55-9527337352292821859?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_03_15-2218826180660300755?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_22_59-5008222236156218093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_44_08-13577284051123796323?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_54_09-15035748702588203005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_03_06-16470597843197434718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_22_59-14029594523599270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_33_42-11417150403926156588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_44_06-12419105826587348316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_52_50-10823423710349239622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_02_04-8608607886976878473?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_11_01-2418262871550913282?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_22_58-11365585909858925191?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_33_05-13808231867479261663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_44_22-16201314295531793661?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_53_47-17056171801893655400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_03_21-15446245922605267688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_22_57-14780998950307375564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_33_19-839788250656618363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_44_52-17061330411811685880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_03_50-1384241089075920977?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_22_59-17429017876174345147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_33_42-17383065001491321448?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_45_39-15730247017864605415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_15_54_52-15944242305284757880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_16_04_37-9395489665969093860?project=apache-beam-testing
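
Several BeamDeprecationWarnings above flag BigQuerySink ("deprecated since 2.11.0. Use WriteToBigQuery instead") and direct reads of <pipeline>.options. A minimal sketch of the suggested replacement, with placeholder project/dataset/table values rather than the ones these ITs use, and with options passed at pipeline construction instead of read back off the pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(project='my-gcp-project')  # placeholder project

    with beam.Pipeline(options=options) as p:
        _ = (p
             | beam.Create([{'word': 'beam', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-gcp-project:my_dataset.my_table',  # placeholder table spec
                 schema='word:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))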

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 131

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:crossLanguagePortableWordCount'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 105

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:crossLanguagePythonJavaFlink'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 55s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/en3zwpf3k7dq6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
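
For local triage of either failure, the log's own "* Try:" hint applies: re-running the failing task from the Beam repository root with the flags Gradle suggests surfaces the underlying stack trace, e.g.:

    ./gradlew :sdks:python:test-suites:portable:py2:crossLanguagePortableWordCount --stacktrace --info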


Jenkins build is back to normal : beam_PostCommit_Python2 #487

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/487/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python2 #486

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/486/display/redirect?page=changes>

Changes:

[hannahjiang] BEAM-8165 pass root as a param and keep default root

------------------------------------------
[...truncated 773.26 KB...]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (1/2) (a37cc84b21f623aca1ddd3a684050ffb) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-root-0917012829-b220c86b (96f4cf5fdef4527c705679d414c50664) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 96f4cf5fdef4527c705679d414c50664 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-root-0917012829-b220c86b(96f4cf5fdef4527c705679d414c50664).
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection af028c2559d274fe2c4b88ca5bfd71ec: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher3db5c2ec-1403-4ec0-a8ba-230181211de2.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher3db5c2ec-1403-4ec0-a8ba-230181211de2.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /tmp/flink-io-1d9e1f1e-279f-4922-9562-4646e32e4ff8
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 96f4cf5fdef4527c705679d414c50664 with leader id 869094acd3a0ebf1eaa60ffcfadd4721 lost leadership.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 869094acd3a0ebf1eaa60ffcfadd4721@akka://flink/user/jobmanager_1 for job 96f4cf5fdef4527c705679d414c50664 from the resource manager.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher3db5c2ec-1403-4ec0-a8ba-230181211de2.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:39967
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 14194 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=pcollection}: 5, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, ref_PCollection_PCollection_17:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_19}: 1, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 12, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 19, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}: 0, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_17:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6, ref_PCollection_PCollection_27:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 
pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/pcollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 4, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: 6, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/pcollection:0}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 3, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0, 
ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 12, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0, ref_PCollection_PCollection_12:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 3, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 6, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, 
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2466>)_26}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}: 4, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 19, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 3, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 3, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 6, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0, ref_PCollection_PCollection_14:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 3, ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 12, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 12, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 0, ref_PCollection_PCollection_9:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 12, ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}: 0, 
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0, ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}: 0)Distributions(ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54, count=3, min=18, max=18}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51, count=3, min=17, max=17}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45, count=3, min=15, max=15}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19, count=1, min=19, max=19}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72, count=3, min=24, max=24}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16, count=1, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192, count=12, min=16, max=16}, ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168, count=12, min=14, max=14}, ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168, count=12, min=14, max=14}, 
ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58, count=1, min=58, max=58}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41, count=1, min=41, max=41}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63, count=3, min=21, max=21}, ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33, count=1, min=33, max=33}, ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180, count=12, min=15, max=15}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57, count=3, min=19, max=19}, ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54, count=3, min=18, max=18}))
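
The dump above is the portable Flink runner printing its pipeline metrics (element counts, ParDo execution times, sampled byte sizes) after the cross-language job completes. User-defined metrics can be pulled from a PipelineResult programmatically; a minimal, self-contained sketch (the counter name and the tiny pipeline are illustrative, not this test's code):

    # Minimal sketch: querying a user-defined counter from a PipelineResult.
    # 'my_counter' and the pipeline below are illustrative, not this test's code.
    from __future__ import print_function
    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.metrics.metric import MetricsFilter

    def tag(x):
        Metrics.counter('example', 'my_counter').inc()
        return x

    p = beam.Pipeline()
    _ = p | beam.Create([1, 2, 3]) | beam.Map(tag)
    result = p.run()
    result.wait_until_finish()
    for counter in result.metrics().query(
            MetricsFilter().with_name('my_counter'))['counters']:
        print(counter.key.metric.name, counter.committed)
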
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Manifest at /tmp/beam-artifact-staging/job_b5c807c9-9be3-4005-ac0b-592988a35854/MANIFEST has 0 artifact locations
[flink-runner-job-invoker] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Removed dir /tmp/beam-artifact-staging/job_b5c807c9-9be3-4005-ac0b-592988a35854/

> Task :sdks:python:test-suites:portable:py2:crossLanguageTests

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
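
The BeamDeprecationWarning just above points at the legacy BigQuerySink; the replacement it names is WriteToBigQuery. A minimal migration sketch, with a hypothetical destination table and schema (not the values this test uses):

    # Hypothetical destination/schema; shows WriteToBigQuery in place of the
    # deprecated BigQuerySink flagged by the warning above.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'month': 1, 'tornado_count': 3}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',
                 schema='month:INTEGER,tornado_count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
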
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
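
Both warnings above come from code that reads <pipeline>.options, deprecated since the first stable release. The supported pattern is to construct PipelineOptions once and keep the reference; a minimal sketch with a hypothetical temp_location:

    # Keep a PipelineOptions reference instead of reading <pipeline>.options.
    # The gs:// bucket here is a hypothetical placeholder.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        DebugOptions, GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
    experiments = options.view_as(DebugOptions).experiments or []
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x)
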
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
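
The FutureWarnings above flag MatchAll and ReadMatches in apache_beam.io.fileio as experimental APIs that these tests exercise. A minimal sketch of the same transforms, with a hypothetical local glob instead of the tests' GCS inputs:

    # MatchAll/ReadMatches sketch; '/tmp/inputs/*.txt' is a hypothetical glob.
    from __future__ import print_function
    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/inputs/*.txt'])
             | fileio.MatchAll()
             | fileio.ReadMatches()
             | 'GetPath' >> beam.Map(lambda rf: rf.metadata.path)
             | beam.Map(print))
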
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

======================================================================
ERROR: test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/examples/fastavro_it_test.py",> line 160, in test_avro_it
    result = self.test_pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 485, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 531, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 561, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 491, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py",> line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1
 
 Pip install failed for package: -r         
 Output from execution of subprocess: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/mock-3.0.5.tar.gz
Collecting setuptools (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/setuptools-41.2.0.zip
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  ERROR: Could not find a version that satisfies the requirement six (from pyhamcrest->-r postcommit_requirements.txt (line 1)) (from versions: none)
ERROR: No matching distribution found for six (from pyhamcrest->-r postcommit_requirements.txt (line 1))

-------------------- >> begin captured logging << --------------------
root: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
root: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0917012221-591691.1568683341.591852/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0917012221-591691.1568683341.591852/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0917012221-591691.1568683341.591852/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0917012221-591691.1568683341.591852/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
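
For context on the failure above: the traceback shows apache_beam/utils/processes.py wrapping subprocess.check_output and re-raising pip's captured output as a RuntimeError, which is how the "six ... (from versions: none)" message ends up in the test report. A rough reconstruction consistent with that traceback (not the module's verbatim source):

    # Rough reconstruction of the wrapper seen in the traceback above;
    # args[0][6] is the '-r' element of the pip command list, which is why
    # the report says "Pip install failed for package: -r".
    import subprocess
    import traceback

    def check_output(*args, **kwargs):
        try:
            return subprocess.check_output(*args, **kwargs)
        except subprocess.CalledProcessError as error:
            raise RuntimeError(
                'Full traceback: {} \n Pip install failed for package: {} \n'
                ' Output from execution of subprocess: {}'.format(
                    traceback.format_exc(), args[0][6], error.output))

pip finding no available versions of six at all, under --no-binary :all:, usually points to a transient index/network problem on the build machine rather than a genuine dependency conflict.
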

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4052.163s

FAILED (SKIP=4, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_35-17829890926190080156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_35_59-11832789368282271702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_44_25-1805171047418526138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_53_22-11066246533483103947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_19_02_49-2104476076245819972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_19_12_20-2649308755980958507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_19_21_22-8327087225486873603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_36-9054250459735493135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_40_05-18338994757017687270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_48_39-2373963419727347376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_56_57-855094477822778133?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_19_05_23-18068140362139263677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_30-2023270903007295432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_43_25-13132721070667919156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_52_08-16557760176834571670?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_19_00_23-18139547431722196664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_30-13130963392261874452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_41_39-6523547117669305091?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_51_09-16446490297694498585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_59_52-17429715387979492048?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_30-17955043354545552170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_31_52-616027286753188906?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_42_06-11362573458411848107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_19_01_30-13025146821001582938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_29-5889451380367740823?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_30_56-11716087986564849117?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_40_40-527318802833885812?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_50_13-9266751178813095184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_59_55-18394790682923393309?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_30-11826355582174434153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_33_02-11765132676293439556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_41_15-398680248148963691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_50_28-11505447457161170540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_59_52-8389984992172488092?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_22_40-15491401024585361621?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_33_24-14706343749994015345?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_44_34-13337320873814361968?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_54_29-6085905020591973420?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 26s
111 actionable tasks: 88 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/lm7sbe5vgukdu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #485

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/485/display/redirect>

------------------------------------------
[...truncated 989.87 KB...]
 location: u'us-central1'
 name: u'beamapp-jenkins-0917003931-646768'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-09-17T00:39:42.750010Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-09-16_17_39_41-17904310761893006702]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_39_41-17904310761893006702?project=apache-beam-testing
root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15686807706764.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 4.64586512947 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:python_write_to_table_15686807706764.python_new_types_table was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15686807706764.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 6.32289797955 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:python_write_to_table_15686807706764.python_new_types_table was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15686807706764.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 10.2232994394 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:python_write_to_table_15686807706764.python_new_types_table was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15686807706764.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 32.1563000765 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:python_write_to_table_15686807706764.python_new_types_table was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15686807706764.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 54.5213802884 seconds before retrying _query_with_retry because we caught exception: NotFound: 404 Not found: Table apache-beam-testing:python_write_to_table_15686807706764.python_new_types_table was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py",> line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15686807706764.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------
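
The repeated "Retry with exponential backoff" warnings in the captured logging come from apache_beam.utils.retry wrapping the BigQuery verifier's query: each NotFound on the not-yet-materialized table triggers a longer wait before _query_with_retry runs again, until the retry budget is exhausted and the last exception fails the test. A minimal sketch of the same pattern (the decorator arguments and always-retry filter are illustrative, not the matcher's actual configuration):

    # Illustrative retry/backoff sketch mirroring the captured logging above.
    from apache_beam.utils import retry
    from google.cloud import bigquery

    @retry.with_exponential_backoff(
        num_retries=4, retry_filter=lambda exn: True)  # illustrative settings
    def query_with_retry(client, query):
        # Raises NotFound (and is retried with growing waits) until the
        # destination table exists.
        return [row.values() for row in client.query(query).result(timeout=60)]

    client = bigquery.Client(project='apache-beam-testing')
    rows = query_with_retry(client, 'SELECT 1')
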

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4564.447s

FAILED (SKIP=4, errors=4)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_03_58-4970227433758130871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_13_38-6903030776584891948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_24_12-16846249547540449759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_34_10-1229588601696064029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_42_38-711667146310343761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_51_37-9062148041945995465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_01_20-14267650343979438897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_18_10_09-12358461334177874648?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_04_03-3228228103757865063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_19_18-9968141881553772011?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_28_00-3860237214404912933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_37_12-4798275915863531687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_46_36-18360637365269121255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_03_58-15271479433367874293?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_24_52-699878014922392123?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_36_18-14068993934062860262?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_46_08-1575375982835580108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_04_02-7757616319883379086?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_19_37-7373589457878488664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_30_14-10213171026935581125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_40_36-9065984611164970416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_03_57-11387778221230426797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_24_32-12837474446689499104?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_34_57-5640435997133845915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_43_51-786832837786728741?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_52_14-16810671079959148043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_03_56-8454956638853864981?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_14_05-10549206350080619740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_24_39-15876502845752474173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_36_15-13965036172826655433?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_45_00-12790932814085977425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_03_58-18159575237483385620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_15_32-9159045564709665863?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_27_51-11545483023813168443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_39_41-17904310761893006702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_42_09-17063997079348512057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_03_57-16693416078730489784?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_15_36-864265901801166044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_27_24-15521096620156970668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-16_17_46_53-135822764174154711?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 131

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:crossLanguagePortableWordCount'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 105

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:crossLanguagePythonJavaFlink'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 1s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/p5vuwnvrdm3ja

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org