Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/10/26 06:53:45 UTC

Build failed in Jenkins: beam_PostCommit_Python35 #820

See <https://builds.apache.org/job/beam_PostCommit_Python35/820/display/redirect>

Changes:


------------------------------------------
[...truncated 88.78 KB...]
root: INFO: ==================== <function expand_sdf at 0x7fce4c96a268> ====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7fce4c96a2f0> ====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n  must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7fce4c96a400> ====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n  must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7fce4c96a488> ====================
root: DEBUG: 2 [4, 6]
root: DEBUG: Stages: ['((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  create/Read:beam:transform:read:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7fce4c96a510> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7fce4c96a598> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 0x7fce4c96a730> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ', '((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7fce4c96a7b8> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: ['((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ', '((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 0x7fce4c96a840> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: ['((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ', '((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ']
root: INFO: Creating state cache with size 100
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fce4c60dac8> for environment urn: "beam:env:embedded_python:v1"

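The ==================== phases logged above (expand_sdf through window_pcollection_coders) are the FnApiRunner's translation passes, each rewriting the stage graph before the embedded Python worker executes it. A minimal sketch of driving that runner directly, assuming the 2.x import path shown in the worker-handler line above:

  import apache_beam as beam
  from apache_beam.runners.portability.fn_api_runner import FnApiRunner

  # Each phase logged above (fusion, read-to-impulse, stage sorting, ...)
  # runs over this pipeline's graph before any bundle is processed.
  with beam.Pipeline(runner=FnApiRunner()) as p:
      (p | beam.Create(['a', 'b', 'c'])
         | beam.Map(print))
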
root: INFO: Running ((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))
root: DEBUG: start <DataOutputOperation >
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DataOutputOperation >
root: DEBUG: Wait for the bundle bundle_16 to finish.
root: INFO: Running ((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17)))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)
root: DEBUG: start <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_write_to_table_15720696942043'
 projectId: 'apache-beam-testing'
 tableId: 'python_no_schema_table'> with schema None.
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to apache-beam-testing:python_write_to_table_15720696942043.python_no_schema_table. Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_17 to finish.
root: INFO: Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_15720696942043.python_no_schema_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/468e46e6-94f1-4d0f-9a00-b449e2e5e86a?maxResults=0&timeoutMs=10000&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/468e46e6-94f1-4d0f-9a00-b449e2e5e86a?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon8a380a0a0dea0708926b9b424464ff0bbba86a10/data HTTP/1.1" 200 None
root: INFO: Result of query is: []
root: INFO: Deleting dataset python_write_to_table_15720696942043 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
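The captured log above shows the streaming-insert flush succeeding ("Passed: True. Errors are []", 4 rows) while the verification query returned [], which is consistent with BigQuery streaming-insert rows not yet being visible to queries. A minimal sketch of the write-without-schema round trip this test appears to exercise, with a hypothetical table spec and row (WriteToBigQuery parameters as in the 2.x Python SDK):

  import apache_beam as beam

  # Hypothetical spec; the real test generates a timestamped dataset name.
  table_spec = 'apache-beam-testing:python_write_to_table_XXX.python_no_schema_table'

  with beam.Pipeline() as p:
      (p | 'create' >> beam.Create([{'bytes': b'xyw', 'date': '2011-01-01',
                                     'time': '23:59:59.999999'}])
         | 'write' >> beam.io.WriteToBigQuery(
             table_spec,
             schema=None,  # no schema given, so the table must already exist
             create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
             method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS))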
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
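Each BeamDeprecationWarning above flags a read through <pipeline>.options; the supported pattern is to keep the PipelineOptions object that built the pipeline and call view_as on it directly. A minimal sketch, with a hypothetical temp bucket:

  from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                    PipelineOptions)

  options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
  # Instead of p.options.view_as(GoogleCloudOptions).temp_location:
  temp_location = options.view_as(GoogleCloudOptions).temp_location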

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 28.705s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_02_04-13491141757975704929?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_17_04-13874839183051145505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_24_31-16259160994499414936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_32_26-8624003403318165237?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_40_41-16693588997527392079?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_01_56-3247302598001986641?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
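The dataflow_runner.py warning above repeats wherever a pipeline still goes through BigQuerySink; the message itself names the replacement. A minimal before/after sketch with a hypothetical table spec and schema:

  import apache_beam as beam

  with beam.Pipeline() as p:
      rows = p | beam.Create([{'s': 'hello'}])
      # Deprecated since 2.11.0:
      #   rows | beam.io.Write(beam.io.BigQuerySink('project:dataset.table',
      #                                             schema='s:STRING'))
      # Replacement:
      rows | beam.io.WriteToBigQuery('project:dataset.table', schema='s:STRING')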
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_22_07-15224077066428302688?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_02_12-6976570259295125252?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_15_33-10997290173351834522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_24_57-2880224800488568180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_32_54-18160445877434824197?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_41_09-15404588811339384995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_01_56-2283406156443713032?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_20_10-5430011030460090929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_28_59-2436867199972140387?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_38_05-14168315154482088567?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_01_58-12886224763638647807?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_11_08-15204766747645886876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_19_37-13547599648745645227?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_27_39-16649292936957696021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_35_42-6964798251772916819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_01_55-12198481824057288995?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_10_25-17177831776248086376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_19_13-9342425760876174378?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_28_23-13428300352816295997?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_36_39-11603161937996027156?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_02_08-12032801665490046610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_11_20-7437851307540817468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_21_02-13177149546303255988?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_29_16-14652249425244785039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_39_11-12658900202602367606?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_01_56-5635053895793938499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_10_48-119487028088981758?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_18_54-17683420466278010297?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_27_49-15334966174076543216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_36_50-14158236521867890095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_23_45_20-9746905431940919896?project=apache-beam-testing
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3145.728s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
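For the non-zero exit above, the failing suite can be rerun on its own with the suggested flags (standard Gradle options; the task path is the one named under "What went wrong"), and --warning-mode all surfaces the Gradle deprecations noted below:

  ./gradlew :sdks:python:test-suites:direct:py35:postCommitIT --stacktrace --info
  ./gradlew :sdks:python:test-suites:direct:py35:postCommitIT --warning-mode all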

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 21s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/z2utpsfs3xdre

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python35 #822

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/822/display/redirect>



Build failed in Jenkins: beam_PostCommit_Python35 #821

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/821/display/redirect>

Changes:


------------------------------------------
[...truncated 127.75 KB...]
        ],
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input910d66c2-3a7f-49e6-b0c6-2ffaeb82d903",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "StreamingUserMetricsDoFn",
            "type": "STRING",
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "generate_metrics.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4",
        "user_name": "generate_metrics"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output910d66c2-3a7f-49e6-b0c6-2ffaeb82d903",
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
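The job graph above is a three-step streaming pipeline: ReadFromPubSub/Read -> generate_metrics (a ParDo of StreamingUserMetricsDoFn) -> dump_to_pub/Write/NativeWrite. A hedged reconstruction in the Python SDK, with the DoFn body and the subscription/topic variables assumed (the real DoFn lives in apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline):

  import apache_beam as beam
  from apache_beam.metrics.metric import Metrics
  from apache_beam.options.pipeline_options import PipelineOptions

  class StreamingUserMetricsDoFn(beam.DoFn):  # body assumed
      def process(self, element):
          # Every distinct user metric name becomes one Stackdriver metric
          # descriptor -- what the 100-descriptor JOB_MESSAGE_WARNING below counts.
          Metrics.counter(self.__class__, 'total_bytes').inc()
          yield element

  input_subscription = 'projects/apache-beam-testing/subscriptions/...'  # see graph above
  output_topic = 'projects/apache-beam-testing/topics/...'               # see graph above

  with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
      (p
       | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(subscription=input_subscription)
       | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
       | 'dump_to_pub' >> beam.io.WriteToPubSub(output_topic))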
root: INFO: Create job: <Job
 createTime: '2019-10-26T12:40:23.296133Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-10-26_05_40_21-7135007266496900249'
 location: 'us-central1'
 name: 'beamapp-jenkins-1026123954-428558'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-10-26T12:40:23.296133Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-10-26_05_40_21-7135007266496900249]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_40_21-7135007266496900249?project=apache-beam-testing
root: INFO: Job 2019-10-26_05_40_21-7135007266496900249 is in state JOB_STATE_RUNNING
root: INFO: 2019-10-26T12:40:25.644Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-26T12:40:26.386Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-10-26T12:40:26.951Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-10-26T12:40:26.954Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-10-26T12:40:26.964Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-10-26T12:40:26.974Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-10-26T12:40:26.977Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-10-26T12:40:26.979Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-10-26T12:40:26.990Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-10-26T12:40:26.993Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
root: INFO: 2019-10-26T12:40:26.995Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
root: INFO: 2019-10-26T12:40:27.003Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-10-26T12:40:27.041Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-10-26T12:40:27.053Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-10-26T12:40:27.201Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-10-26T12:40:27.219Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-10-26T12:40:27.224Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-10-26T12:40:30.740Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+generate_metrics+dump_to_pub/Write/NativeWrite
root: INFO: 2019-10-26T12:41:01.420Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-10-26T12:41:01.422Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
root: INFO: 2019-10-26T12:41:02.024Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-10-26T12:41:10.220Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
root: INFO: 2019-10-26T12:41:14.290Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: WARNING: Timing out on waiting for job 2019-10-26_05_40_21-7135007266496900249 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------
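The 60-second timeout in the captured log above comes from the test harness waiting on the streaming job. In user code, the equivalent bounded wait looks roughly like this; the pipeline construction and the cancel-on-timeout handling are illustrative, not the test's own logic (duration is in milliseconds):

  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions
  from apache_beam.runners.runner import PipelineState

  p = beam.Pipeline(options=PipelineOptions())  # placeholder pipeline
  result = p.run()
  result.wait_until_finish(duration=60 * 1000)  # wait at most 60 s
  if result.state != PipelineState.DONE:
      result.cancel()  # illustrative cleanup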
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_40-12354694637951064146?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
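The deprecation above is about reading options back off the pipeline object. The warning-free pattern is to keep a reference to the PipelineOptions you constructed and query that instead; a minimal sketch, where the experiment flag is just an example value:

  from apache_beam.options.pipeline_options import PipelineOptions, DebugOptions

  options = PipelineOptions(['--experiments=use_beam_bq_sink'])  # example flag
  experiments = options.view_as(DebugOptions).experiments or []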
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_17_27-4339932536917275113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_25_49-17253990054432103861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_33_35-8092028869836478870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_41_29-14574755809121345963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_35-13956187588985925245?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_22_14-5810706485031213614?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
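For the BigQuerySink deprecation above, the replacement the message names is WriteToBigQuery. A minimal sketch, with a made-up table spec and schema:

  import apache_beam as beam

  with beam.Pipeline() as p:
      (p
       | beam.Create([{'name': 'example', 'value': 1}])
       | beam.io.WriteToBigQuery(
           'my-project:my_dataset.my_table',   # hypothetical table spec
           schema='name:STRING,value:INTEGER',
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))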
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_31_19-8823131808052950585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_40_21-7135007266496900249?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_38-1014693561578851452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_15_25-12633505599895932618?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_23_39-15234823243622822639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_31_55-16605467248558769065?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_40_43-10675363004077992794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_35-13558652327962546213?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_23_16-3586342555894872043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_34-4912700314884588461?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_11_34-3885809311299320938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_19_28-12597181213063077276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_28_08-9857168529448024656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_36_57-15404969837545169000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_45_27-4387314381145859007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_34-11883939835436908501?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_10_54-15207070455223126996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_19_24-7891879294255604297?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_29_14-3526189801820265936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_37_04-14824414472643854937?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_36-290143715678763945?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_12_01-3028409722609881794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_21_48-11220581067162495185?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_31_07-14367433719812194539?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_40_37-4429872594313276243?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_02_40-1738533677742537986?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_11_37-4446741436591760973?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_19_29-15019589233920161107?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
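The FutureWarnings above flag fileio.MatchAll and fileio.ReadMatches as experimental; those are the transforms the test exercises. A rough usage sketch, with a placeholder file pattern rather than the test's own inputs:

  import apache_beam as beam
  from apache_beam.io import fileio

  with beam.Pipeline() as p:
      (p
       | beam.Create(['gs://my-bucket/path/*.txt'])  # placeholder glob
       | fileio.MatchAll()                           # pattern -> file metadata
       | fileio.ReadMatches()                        # metadata -> readable files
       | 'GetPath' >> beam.Map(lambda readable_file: readable_file.metadata.path))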
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_27_07-6282784204498806257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-26_05_36_18-11686880207140288305?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:707: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3123.141s

FAILED (SKIP=6, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
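
As the hints above say, rerunning the failing task with more output is the usual way to surface the underlying Python test error, e.g. './gradlew :sdks:python:test-suites:dataflow:py35:postCommitIT --stacktrace --info' (run from the root of a local Beam checkout; the task name is taken from the failure message above, and the same applies to the :sdks:python:test-suites:direct:py35:postCommitIT failure).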
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 13s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/g3u2psjmts5zw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org