Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/04/02 01:26:59 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #1046

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1046/display/redirect?page=changes>

Changes:

[rohde.samuel] Update Dataflow V1Beta3 to newest version to add the

[rohde.samuel] Add the 'enable_hot_key_logging' PipelineOption and plumb it to the

[lpost] [BEAM-12079] Deterministic coding enforcement causes

[noreply] fix: variables names in test-stream.md

[Udi Meiri] Update dev Dataflow containers to latest version.

[Kyle Weaver] [BEAM-12033] Create ZetaSqlException, which contains a GRPC Status code.

[noreply] Merge pull request #14372 from [BEAM-12014] Add BIGNUMERIC support in


------------------------------------------
[...truncated 40.14 MB...]
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']
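
The "unhashable type: 'TableReference'" error above comes from the schema lambda in bigquery_test.py using the destination TableReference as a dictionary key (table_map.get(dest, None)). The minimal standalone sketch below (not the actual Beam or apitools classes) reproduces that failure mode: a class that defines __eq__ without __hash__ is unhashable in Python 3, so dict lookups with such a key raise exactly this TypeError. The string-spec workaround at the end is only an illustration, not necessarily the fix adopted for BEAM-12079.

    class TableReference:
        # Stand-in for the real TableReference message class (assumption).
        def __init__(self, projectId, datasetId, tableId):
            self.projectId, self.datasetId, self.tableId = projectId, datasetId, tableId

        # Defining __eq__ without __hash__ sets __hash__ to None in Python 3,
        # so instances cannot be used as dict keys.
        def __eq__(self, other):
            return (self.projectId, self.datasetId, self.tableId) == \
                   (other.projectId, other.datasetId, other.tableId)

    dest = TableReference('apache-beam-testing', 'some_dataset', 'some_table')
    table_map = {}

    try:
        table_map.get(dest, None)   # same call shape as the failing lambda
    except TypeError as err:
        print(err)                  # -> unhashable type: 'TableReference'

    # One possible workaround: key the side-input map by the string table spec
    # ("project:dataset.table") and convert the destination before the lookup.
    spec = '%s:%s.%s' % (dest.projectId, dest.datasetId, dest.tableId)
    schema_by_spec = {'apache-beam-testing:some_dataset.some_table': 'col:STRING'}
    print(schema_by_spec.get(spec, None))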

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:38.664Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:39.391Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:39.889Z: JOB_MESSAGE_DEBUG: Executing failure step failure48
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:40.077Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-040200435-04011744-f06y-harness-bvnc
      Root cause: Work item failed.,
  beamapp-jenkins-040200435-04011744-f06y-harness-bvnc
      Root cause: Work item failed.,
  beamapp-jenkins-040200435-04011744-f06y-harness-bvnc
      Root cause: Work item failed.,
  beamapp-jenkins-040200435-04011744-f06y-harness-bvnc
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:40.401Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:40.474Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:51:40.514Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:52:24.704Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:52:24.748Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T00:52:24.786Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-04-01_17_44_02-13246612188569405916 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_test: INFO: Deleting dataset python_bq_streaming_inserts_16173242282899 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_45-17499911106935513691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_18_22-6111063415707104243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_26_36-17327399244369260471?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_34_48-12739655610141691080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_43_34-8904825658856291089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_52_05-17663820542570565868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_00_50-7910866993548419496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_10_04-9541438453981390952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_44-12666063734752426789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_25_09-9358276951089896496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_34_10-15095321732487843045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_44_02-13246612188569405916?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_53_39-8169603264695623614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_01_19-14121265306356075688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_10_29-11813810985893855594?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_18_26-12975671982474253829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_43-3481809735560369729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_16_19-4146306868750046917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_26_27-11655956157150394654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_36_46-16532005754299127761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_46_27-10239772575795602718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_03_12-6103011993953709379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_37-10196174445192590071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_24_48-260043316342306826?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_35_27-4344886509974489191?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_45_43-17600804804208880445?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_55_21-761759604198406612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_05_01-6211927963546889031?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_14_33-15586965309424652698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_06_49-2849046091287359575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_16_24-4490476733231249062?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_25_16-10937490425694747470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_33_33-9130949177186503729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_42_13-4434591154204181687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_51_20-14217975704553693486?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_00_11-6287410410108792747?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_08_35-2107570108767478992?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_40-11923440318096290666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_15_06-3036661315574918107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_25_08-5400835150284889802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_36_38-15516523646359458818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_46_04-12916830124750699196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_55_31-4493096053694879498?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_05_30-2231336251416920402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_14_38-5677104774130447032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_45-8796887955952029692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_14_11-6823945865749004560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_24_28-31274945521125562?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_36_26-6497033412415571447?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_45_12-13578391856526670800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_54_08-11949829486094318114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_02_58-6574231160160435072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_12_11-18051810939494385200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_04_39-2714590777239726958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_14_26-17562351703165780310?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_23_41-9544847414413479870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_31_44-9917211290700106218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_39_58-16539915611439943901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_48_16-542683157434030362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_17_57_55-15994789226180786108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_07_24-9871979369858046350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-01_18_17_22-8824302216191922078?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4968.423s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 26m 30s
206 actionable tasks: 150 executed, 52 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/f7yw2fcpocj74

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #1051

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1051/display/redirect>



Build failed in Jenkins: beam_PostCommit_Python38 #1050

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1050/display/redirect?page=changes>

Changes:

[Ismaël Mejía] [BEAM-11213] Display Beam Metrics in Spark History Server for Classic

[Boyuan Zhang] Re-enable CrossLanguageKafkaIOTest

[noreply] [BEAM-11747] Narrow list of unsupported types in BeamJavaUdfCalcRule.

[noreply] [BEAM-12079] Enforce callable destination arg type for WriteToBigQuery

[Ismaël Mejía] [BEAM-12088] Make file staging uniform among Spark Runners


------------------------------------------
[...truncated 41.89 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:28.256Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:28.362Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:28.458Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:37.541Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:37.613Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:37.703Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:37.758Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:20:37.794Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:21:35.579Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:21:35.615Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:21:35.650Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-02_18_13_48-5408985349177326597 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.201Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+write/BigQueryBatchFileLoads/LoadJobNamePrefix+write/BigQueryBatchFileLoads/SchemaModJobNamePrefix+write/BigQueryBatchFileLoads/CopyJobNamePrefix+write/BigQueryBatchFileLoads/GenerateFilePrefix
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.267Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/LoadJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.376Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/SchemaModJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.399Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/CopyJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.435Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.466Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.495Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.530Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.538Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.557Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.571Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.600Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.627Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.650Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.661Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.693Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.694Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.704Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.732Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.788Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.836Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.867Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.897Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:06.966Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:08.768Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:08.920Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:09.028Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:09.376Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:09.469Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:09.634Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:26.793Z: JOB_MESSAGE_BASIC: Finished operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:26.870Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:26.928Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:27.027Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:30.148Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:30.251Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:30.323Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:30.423Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.673Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.741Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.788Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.854Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.880Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.031Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.171Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.343Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.426Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.912Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.954Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.961Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:42.995Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.029Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.058Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.095Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.102Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.147Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.186Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.232Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:43.283Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:51.434Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:51.497Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:51.579Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:51.628Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:51.657Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:52.924Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:53.016Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:53.047Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:53.126Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:53.183Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:53.282Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:53.391Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:24:55.924Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:02.980Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:03.066Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:03.142Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:03.215Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:03.280Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:03.368Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:06.976Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:07.093Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:07.223Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:07.309Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:07.483Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:07.641Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:10.215Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:10.322Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:10.451Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:10.569Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:10.682Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:10.790Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:11.073Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:11.205Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:11.308Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:13.111Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:13.215Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:13.337Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:13.477Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:14.025Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:14.150Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:14.368Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:14.517Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:14.553Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:55.332Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:55.405Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:25:55.439Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-02_18_17_19-2095158255481111289 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:26:06.526Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:26:06.594Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-03T01:26:06.650Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-04-02_18_16_23-14671222624691579709 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16174125693200.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/e45d6814-cd5c-4833-852d-8c574166261c?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/e45d6814-cd5c-4833-852d-8c574166261c?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16174125693200 in project apache-beam-testing
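
The matcher's check above amounts to running a plain SQL query against the freshly written table and comparing the returned rows. A minimal sketch of that step, assuming the google-cloud-bigquery client (not the matcher's actual implementation); the project, dataset and table names are taken from the log lines above and the dataset is deleted again right after the test:

    # Hedged sketch of the post-write verification query, assuming the
    # google-cloud-bigquery client. Names come from the log above; the
    # dataset is dropped by the test's cleanup, so this is illustrative only.
    import google.auth
    from google.cloud import bigquery

    # On a Dataflow/GCE worker, default credentials come from the metadata
    # server, matching the google.auth debug lines above.
    credentials, project = google.auth.default(
        scopes=["https://www.googleapis.com/auth/bigquery"])
    client = bigquery.Client(project="apache-beam-testing",
                             credentials=credentials)

    query = ("SELECT bytes, date, time "
             "FROM python_write_to_table_16174125693200.python_no_schema_table")
    rows = [tuple(row.values()) for row in client.query(query).result()]
    print(rows)  # the matcher compares these against the expected rows
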
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4976.813s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 144

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 25m 50s
208 actionable tasks: 151 executed, 53 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/omeekojqek2h6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure




Build failed in Jenkins: beam_PostCommit_Python38 #1049

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1049/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7372] delete codes for compatibility of py2 from apache_beam/io


------------------------------------------
[...truncated 39.76 MB...]
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']
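
The TypeError above is raised because the schema callback uses the destination TableReference as a dictionary key: in Python 3 a class that defines __eq__ without __hash__ is unhashable, and dict lookups on such a key fail with exactly this message. A minimal, self-contained illustration (not Beam code; the class below is a hypothetical stand-in that only mimics the relevant behaviour):

    # Hypothetical stand-in, not Beam's TableReference: it compares by value
    # but defines no __hash__, so Python 3 marks it unhashable.
    class TableReference:
        def __init__(self, project, dataset, table):
            self.project, self.dataset, self.table = project, dataset, table

        def __eq__(self, other):
            return (self.project, self.dataset, self.table) == (
                other.project, other.dataset, other.table)
        # Defining __eq__ without __hash__ implicitly sets __hash__ = None.

    table_map = {}
    dest = TableReference("apache-beam-testing", "dataset", "table")
    try:
        table_map.get(dest, None)  # same call shape as the failing lambda
    except TypeError as e:
        print(e)  # unhashable type: 'TableReference'

The usual fix is to key such side-input maps by a hashable form of the destination (for example the "project:dataset.table" string) instead of the TableReference object; the FromHashableTableRef step visible in the stage names above suggests the sink itself round-trips through such a representation.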

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:54.355Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:57.639Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:57.667Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:57.738Z: JOB_MESSAGE_DEBUG: Executing failure step failure48
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:57.767Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-040218423-04021142-9roh-harness-6ll2
      Root cause: Work item failed.,
  beamapp-jenkins-040218423-04021142-9roh-harness-6ll2
      Root cause: Work item failed.,
  beamapp-jenkins-040218423-04021142-9roh-harness-6ll2
      Root cause: Work item failed.,
  beamapp-jenkins-040218423-04021142-9roh-harness-6ll2
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:57.860Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:57.928Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:58.011Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:49:58.048Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:50:56.292Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:50:56.336Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T18:50:56.370Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-04-02_11_42_47-1466060333379360437 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_test: INFO: Deleting dataset python_bq_streaming_inserts_16173889532194 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4978.028s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 25m 54s
206 actionable tasks: 147 executed, 55 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/rl43usn3hccyk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python38 #1048

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1048/display/redirect?page=changes>

Changes:

[noreply] Support multilayer ZetaSQL UNNEST (#14342)


------------------------------------------
[...truncated 40.18 MB...]

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:08.883Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.180Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.306Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.362Z: JOB_MESSAGE_DEBUG: Executing failure step failure48
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.384Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-040212413-04020541-306n-harness-cfql
      Root cause: Work item failed.,
  beamapp-jenkins-040212413-04020541-306n-harness-cfql
      Root cause: Work item failed.,
  beamapp-jenkins-040212413-04020541-306n-harness-cfql
      Root cause: Work item failed.,
  beamapp-jenkins-040212413-04020541-306n-harness-cfql
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.478Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.551Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:49:12.580Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:50:05.750Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:50:05.791Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T12:50:05.819Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-04-02_05_41_39-5882953127623678889 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_test: INFO: Deleting dataset python_bq_streaming_inserts_16173672879049 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4888.125s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 9s
206 actionable tasks: 149 executed, 53 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/u4tx2smuixrcq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #1047

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1047/display/redirect>

Changes:


------------------------------------------
[...truncated 39.65 MB...]

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:01.708Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:04.968Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1300, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 581, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1395, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1249, in process
    schema = self.schema(destination, *schema_side_inputs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1247, in <lambda>
    table_map: table_map.get(dest, None),
TypeError: unhashable type: 'TableReference' [while running 'WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:04.994Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:05.052Z: JOB_MESSAGE_DEBUG: Executing failure step failure48
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:05.086Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/DropShard+WriteWithMultipleDests/_StreamToBigQuery/FromHashableTableRef+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-040206430-04012343-bn5f-harness-fj0m
      Root cause: Work item failed.,
  beamapp-jenkins-040206430-04012343-bn5f-harness-fj0m
      Root cause: Work item failed.,
  beamapp-jenkins-040206430-04012343-bn5f-harness-fj0m
      Root cause: Work item failed.,
  beamapp-jenkins-040206430-04012343-bn5f-harness-fj0m
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:05.174Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:05.235Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:05.280Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:54.424Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:54.452Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-02T06:51:54.483Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-04-01_23_43_13-18009954575301064769 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_test: INFO: Deleting dataset python_bq_streaming_inserts_16173457815285 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4900.086s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 19s
206 actionable tasks: 147 executed, 55 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/jqsycrsfqosem

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org