Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/10/22 00:52:09 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #747

See <https://builds.apache.org/job/beam_PostCommit_Python37/747/display/redirect>

Changes:


------------------------------------------
[...truncated 241.82 KB...]
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
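
The options line above shows how the post-commit suite drives these integration tests through nosetests. As a minimal local-repro sketch, assuming nose and the Beam Python SDK are installed and pipeline options/GCP credentials are configured the way the Jenkins job's virtualenv is (only the flags echoed above are used; everything else is environment setup):

    # Minimal sketch: re-run just the BigQuery write IT that fails below,
    # with the same nose flags echoed in the options line above. Assumes
    # nose and the Beam SDK are installed; credentials and pipeline
    # options must already be configured, as in the Jenkins environment.
    import nose

    nose.run(argv=[
        'nosetests',
        '--tests=apache_beam.io.gcp.bigquery_write_it_test:'
        'BigQueryWriteIntegrationTests.test_big_query_write_new_types',
        '--nocapture',
        '--process-timeout=4500',
    ])
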
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... FAIL
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok

======================================================================
FAIL: test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_it_test.py",> line 229, in test_big_query_write_new_types
    write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py",> line 427, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py",> line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py",> line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py",> line 51, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Expected data is [(0.33, Decimal('10'), b'\xab\xac', datetime.date(3000, 12, 31), datetime.time(23, 59, 59), datetime.datetime(2018, 12, 31, 12, 44, 31), datetime.datetime(2018, 12, 31, 12, 44, 31, 744957, tzinfo=<UTC>), 'POINT(30 10)'), (0.33, None, None, None, None, None, None, None), (None, Decimal('10'), None, None, None, None, None, None), (None, None, b'\xab\xac', None, None, None, None, None), (None, None, None, datetime.date(3000, 12, 31), None, None, None, None), (None, None, None, None, datetime.time(23, 59, 59), None, None, None), (None, None, None, None, None, datetime.datetime(2018, 12, 31, 12, 44, 31), None, None), (None, None, None, None, None, None, datetime.datetime(2018, 12, 31, 12, 44, 31, 744957, tzinfo=<UTC>), None), (None, None, None, None, None, None, None, 'POINT(30 10)')])
     but: Expected data is [(0.33, Decimal('10'), b'\xab\xac', datetime.date(3000, 12, 31), datetime.time(23, 59, 59), datetime.datetime(2018, 12, 31, 12, 44, 31), datetime.datetime(2018, 12, 31, 12, 44, 31, 744957, tzinfo=<UTC>), 'POINT(30 10)'), (0.33, None, None, None, None, None, None, None), (None, Decimal('10'), None, None, None, None, None, None), (None, None, b'\xab\xac', None, None, None, None, None), (None, None, None, datetime.date(3000, 12, 31), None, None, None, None), (None, None, None, None, datetime.time(23, 59, 59), None, None, None), (None, None, None, None, None, datetime.datetime(2018, 12, 31, 12, 44, 31), None, None), (None, None, None, None, None, None, datetime.datetime(2018, 12, 31, 12, 44, 31, 744957, tzinfo=<UTC>), None), (None, None, None, None, None, None, None, 'POINT(30 10)')] Actual data is []
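
The traceback ends in TestDirectRunner unpickling the test's on_success_matcher and applying it to the pipeline result via hc_assert_that; the matcher reports the expected rows against an empty actual result. A minimal sketch of that verification pattern, assuming Beam's TestPipeline and BigqueryFullResultMatcher from the SDK (the table name, query, input element, and expected rows here are illustrative placeholders, not the test's exact values):

    # Sketch of the pattern behind the assertion above: the IT pickles a
    # BigqueryFullResultMatcher into the pipeline's on_success_matcher
    # option, and the test runner applies it to the pipeline result.
    # Table, query, and data below are placeholders.
    import apache_beam as beam
    from apache_beam.io.gcp.tests.bigquery_matcher import BigqueryFullResultMatcher
    from apache_beam.testing.test_pipeline import TestPipeline

    test_pipeline = TestPipeline(is_integration_test=True)
    matcher = BigqueryFullResultMatcher(
        project='apache-beam-testing',
        query='SELECT float, numeric FROM my_dataset.my_table',  # placeholder
        data=[(0.33, None)])  # placeholder expected rows

    args = test_pipeline.get_full_options_as_args(on_success_matcher=matcher)
    with beam.Pipeline(argv=args) as p:
        _ = (p
             | beam.Create([{'float': 0.33, 'numeric': None}])
             | beam.io.WriteToBigQuery(
                 'my_dataset.my_table',  # placeholder table
                 write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
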

-------------------- >> begin captured logging << --------------------
root: INFO: Created dataset python_write_to_table_15717029578268 in project apache-beam-testing
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Any
root: INFO: ==================== <function annotate_downstream_side_inputs at 0x7fcc128bd9d8> ====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function fix_side_input_pcoll_coders at 0x7fcc128bdae8> ====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function lift_combiners at 0x7fcc128bdb70> ====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_sdf at 0x7fcc128bdbf8> ====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7fcc128bdc80> ====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n  must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7fcc128bdd90> ====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n  must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7fcc128bde18> ====================
root: DEBUG: 2 [4, 6]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  create/Read:beam:transform:read:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7fcc128bdea0> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7fcc128bdf28> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 0x7fcc128c1158> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7fcc128c11e0> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: ['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ', '(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 0x7fcc128c1268> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: ['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ', '(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n  downstream_side_inputs: ']
root: INFO: Creating state cache with size 100
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fcc1268b828> for environment urn: "beam:env:embedded_python:v1"

root: INFO: Running (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)
root: DEBUG: start <DataOutputOperation >
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DataOutputOperation >
root: DEBUG: Wait for the bundle bundle_3 to finish.
root: INFO: Running (((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)
root: DEBUG: start <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_write_to_table_15717029578268'
 projectId: 'apache-beam-testing'
 tableId: 'python_new_types_table'> with schema {'fields': [{'name': 'float', 'type': 'FLOAT'}, {'name': 'numeric', 'type': 'NUMERIC'}, {'name': 'bytes', 'type': 'BYTES'}, {'name': 'date', 'type': 'DATE'}, {'name': 'time', 'type': 'TIME'}, {'name': 'datetime', 'type': 'DATETIME'}, {'name': 'timestamp', 'type': 'TIMESTAMP'}, {'name': 'geo', 'type': 'GEOGRAPHY'}]}.
root: DEBUG: Created the table with id python_new_types_table
root: INFO: Created table apache-beam-testing.python_write_to_table_15717029578268.python_new_types_table with schema <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'float'
 type: 'FLOAT'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'numeric'
 type: 'NUMERIC'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'bytes'
 type: 'BYTES'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'date'
 type: 'DATE'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'time'
 type: 'TIME'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'datetime'
 type: 'DATETIME'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'timestamp'
 type: 'TIMESTAMP'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'geo'
 type: 'GEOGRAPHY'>]>. Result: <Table
 creationTime: 1571702958293
 etag: 'MS4nHqa0ZozaoLeZ9Uxq2g=='
 id: 'apache-beam-testing:python_write_to_table_15717029578268.python_new_types_table'
 kind: 'bigquery#table'
 lastModifiedTime: 1571702958426
 location: 'US'
 numBytes: 0
 numLongTermBytes: 0
 numRows: 0
 schema: <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'float'
 type: 'FLOAT'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'numeric'
 type: 'NUMERIC'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'bytes'
 type: 'BYTES'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'date'
 type: 'DATE'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'time'
 type: 'TIME'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'datetime'
 type: 'DATETIME'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'timestamp'
 type: 'TIMESTAMP'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'geo'
 type: 'GEOGRAPHY'>]>
 selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_write_to_table_15717029578268/tables/python_new_types_table'
 tableReference: <TableReference
 datasetId: 'python_write_to_table_15717029578268'
 projectId: 'apache-beam-testing'
 tableId: 'python_new_types_table'>
 type: 'TABLE'>.
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 9
root: DEBUG: Flushing data to apache-beam-testing:python_write_to_table_15717029578268.python_new_types_table. Total 9 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_4 to finish.
root: INFO: Attempting to perform query SELECT float, numeric, bytes, date, time, datetime,timestamp, geo FROM python_write_to_table_15717029578268.python_new_types_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/aab8d634-2511-4c0f-b4b8-4654f8833d88?maxResults=0&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon320a0329a185205ffd31b4dd4d0abbbb4730c357/data HTTP/1.1" 200 None
root: INFO: Result of query is: []
root: INFO: Deleting dataset python_write_to_table_15717029578268 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
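
The captured log above shows the write path reporting success (all 9 rows flushed, "Passed: True. Errors are []") while the verification query run immediately afterwards returned []. That pattern is consistent with BigQuery streaming inserts sitting in the streaming buffer and not yet being visible to queries. A hedged sketch of a more tolerant verifier that retries the query before failing, using the google-cloud-bigquery client (the SQL matches the query in the log; the retry bounds are arbitrary illustrative choices, not Beam's):

    # Sketch: retry the verification query to tolerate streaming-buffer
    # visibility lag. SQL is the query from the log; attempts/wait_s are
    # arbitrary illustrative values.
    import time
    from google.cloud import bigquery

    def query_with_retries(project, sql, expected_count, attempts=5, wait_s=30):
        client = bigquery.Client(project=project)
        rows = []
        for _ in range(attempts):
            rows = list(client.query(sql).result())
            if len(rows) >= expected_count:
                break
            time.sleep(wait_s)  # rows may still be in the streaming buffer
        return rows

    rows = query_with_retries(
        'apache-beam-testing',
        'SELECT float, numeric, bytes, date, time, datetime, timestamp, geo '
        'FROM python_write_to_table_15717029578268.python_new_types_table',
        expected_count=9)
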
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 28.310s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py37:postCommitIT FAILED
Daemon will be stopped at the end of the build after the daemon was no longer found in the daemon registry
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
