Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/11/06 01:23:36 UTC

Build failed in Jenkins: beam_PostCommit_Python38 #491

See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/491/display/redirect?page=changes>

Changes:

[Pablo Estrada] Updating BigQuery client for Python

[Andrew Pilloud] [BEAM-11165] ZetaSQL Calc only convert referenced columns

[Robin Qiu] Support read/write ZetaSQL DATETIME/NUMERIC types from/to BigQuery

[Robin Qiu] Address comments

[noreply] Merge pull request #13164 from Refactoring BigQuery Read utilities into

[Robert Burke] Moving to 2.27.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 24.54 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:15.947Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:15.968Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:15.971Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.007Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.023Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.044Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.098Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:07:16.149Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:14:51.451Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:14:56.708Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:14:57.588Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:01.517Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 157, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 605, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 516, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_798_df6e2f109dea5595da17626a90a21b91_757d9a4504d5448d95779055a0e67c8f failed. Error Result: <ErrorProto
 message: 'An internal error occurred and the request could not be completed. Error: 1842444'
 reason: 'internalError'> [while running 'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.180Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.259Z: JOB_MESSAGE_DEBUG: Executing failure step failure47
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.298Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.,
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.,
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.,
  beamapp-jenkins-110601002-11051700-zkpn-harness-rxxx
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.378Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.638Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:02.669Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:44.139Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:44.206Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T01:15:44.243Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-11-05_17_00_36-3082242840281892995 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 67 tests in 4469.211s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 144

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 23m 10s
205 actionable tasks: 168 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wlixxrxrbzto6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
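
Note on build #491: the postCommitIT failure traces back to bigquery_tools.wait_for_bq_job being called with max_retries=0, so a single transient BigQuery 'internalError' on a load job fails the whole work item (four times, once per attempt). The sketch below only illustrates that submit-poll-retry shape in isolation; submit_job and get_status are hypothetical stand-ins for the BigQuery jobs.insert/jobs.get calls and are not Beam's actual wrapper API.

    import time

    def run_load_with_retries(submit_job, get_status, sleep_duration_sec=10, max_retries=0):
        # submit_job() starts a load job; get_status(job) returns (state, error_result).
        # Both are hypothetical placeholders for the underlying BigQuery API calls.
        attempts = 0
        while True:
            job = submit_job()
            while True:
                state, error = get_status(job)
                if state == 'DONE':
                    break
                time.sleep(sleep_duration_sec)
            if error is None:
                return job
            attempts += 1
            if attempts > max_retries:
                raise RuntimeError('BigQuery load job failed: %r' % (error,))

With max_retries=0, the first failed attempt raises immediately, which matches the RuntimeError shown in the captured logging above.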

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python38 #494

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/494/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #493

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/493/display/redirect>

Changes:


------------------------------------------
[...truncated 24.47 MB...]
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "ReadFromPubSub/Read.out"
          }
        ],
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_inputd65d11f3-b1ad-4610-9ed4-53de0e590918",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s3",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "bytes_to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s2"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_6",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s4",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s3"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputd65d11f3-b1ad-4610-9ed4-53de0e590918",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2020-11-06T12:47:39.548046Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-11-06_04_47_38-5830370542836908374'
 location: 'us-central1'
 name: 'beamapp-jenkins-1106124731-671382'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-11-06T12:47:39.548046Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-11-06_04_47_38-5830370542836908374]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2020-11-06_04_47_38-5830370542836908374
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-06_04_47_38-5830370542836908374?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-11-06_04_47_38-5830370542836908374 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:42.200Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:43.943Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:43.978Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.035Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.245Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.280Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.336Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.382Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.411Z: JOB_MESSAGE_DETAILED: Fusing consumer modify_data into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.443Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into modify_data
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.483Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.510Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_inputd65d11f3-b1ad-4610-9ed4-53de0e590918 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.540Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.575Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.606Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.638Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:44.769Z: JOB_MESSAGE_DEBUG: Executing wait step start19
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:46.830Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_inputd65d11f3-b1ad-4610-9ed4-53de0e590918'.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:46.891Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+modify_data+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:46.937Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:46.963Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:47:53.111Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:50:13.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:50:44.223Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-11-06T12:50:44.252Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-11-06_04_47_38-5830370542836908374 after 183 seconds
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 241
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_outputd65d11f3-b1ad-4610-9ed4-53de0e590918.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 67 tests in 4445.238s

FAILED (SKIP=7, failures=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 1s
205 actionable tasks: 161 executed, 40 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/nvg34wslbktjq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
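
Note on build #493: this run failed differently. The streaming job reached JOB_STATE_RUNNING and workers started, but the Pub/Sub matcher timed out after 300 seconds having received 0 messages on the output subscription. The loop below is a rough sketch of that wait-and-assert pattern; pull_once is a hypothetical stand-in for one pull from the output subscription and is not the actual pubsub_matcher implementation.

    import time

    def wait_for_messages(pull_once, expected_count, timeout_sec=300, poll_interval_sec=5):
        # pull_once() is a hypothetical callable returning the messages received
        # on a single pull from the output subscription.
        deadline = time.time() + timeout_sec
        received = []
        while time.time() < deadline:
            received.extend(pull_once())
            if len(received) >= expected_count:
                return received
            time.sleep(poll_interval_sec)
        raise AssertionError(
            'Timeout after %d sec. Received %d messages.' % (timeout_sec, len(received)))

A zero-message timeout like this usually means the pipeline never published to the output topic in time, though a topic/subscription mismatch can produce the same symptom.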

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python38 #492

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/492/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-11165] Use the ZetaSQL Streaming API synchronously

[noreply] [BEAM-11159] Use GCP pubsub client for TestPubsub (#13273)


------------------------------------------
[...truncated 24.41 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:16:00.054Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:16:00.089Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:15:58.623Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:15:58.669Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:15:58.696Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-05_23_08_19-8042772758543956357 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:16:53.768Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:16:53.835Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:16:53.871Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-05_23_09_36-18375506980931860500 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_1604646562929.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/583c504e-0288-46a6-b0d9-329fd55bfbed?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon4caab8f5681946e961ddc08864c087d040b806aa/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_1604646562929.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:19:50.645Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:19:50.718Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:19:50.767Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:19:50.861Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:00.002Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:00.069Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:00.144Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:00.191Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:00.213Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:43.826Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:43.895Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-06T07:20:43.932Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-05_23_13_31-9634718213445658667 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ERROR
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1187, in test_streaming_inserts
    self._run_pubsub_bq_pipeline(WriteToBigQuery.Method.STREAMING_INSERTS)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py",> line 1179, in _run_pubsub_bq_pipeline
    _ = mesages | WriteToBigQuery(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py",> line 568, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py",> line 514, in run
    return Pipeline.from_runner_api(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py",> line 547, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 56, in run_pipeline
    self.result = super(TestDataflowRunner,
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 526, in run_pipeline
    artifacts=environments.python_sdk_dependencies(options)))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/transforms/environments.py",> line 738, in python_sdk_dependencies
    staged_name in stager.Stager.create_job_resources(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 176, in create_job_resources
    (
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/retry.py",> line 260, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 569, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/processes.py",> line 96, in check_output
    raise RuntimeError( \
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.8/subprocess.py", line 411, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.8/subprocess.py", line 512, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1.
 
 Pip install failed for package: -r           
 Output from execution of subprocess: b'Collecting pyhamcrest!=1.10.0,<2.0.0\n  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz\nCollecting mock<3.0.0\n  File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz\nCollecting parameterized<0.8.0,>=0.7.1\n  File was already downloaded /tmp/dataflow-requirements-cache/parameterized-0.7.4.tar.gz\nCollecting six\n  File was already downloaded /tmp/dataflow-requirements-cache/six-1.15.0.tar.gz\n    ERROR: Command errored out with exit status 1:\n     command: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python> -c \'import sys, setuptools, tokenize; sys.argv[0] = \'"\'"\'/tmp/pip-download-fp6zf1cx/six/setup.py\'"\'"\'; __file__=\'"\'"\'/tmp/pip-download-fp6zf1cx/six/setup.py\'"\'"\';f=getattr(tokenize, \'"\'"\'open\'"\'"\', open)(__file__);code=f.read().replace(\'"\'"\'\\r\\n\'"\'"\', \'"\'"\'\\n\'"\'"\');f.close();exec(compile(code, __file__, \'"\'"\'exec\'"\'"\'))\' egg_info --egg-base /tmp/pip-pip-egg-info-l8w3nyve\n         cwd: /tmp/pip-download-fp6zf1cx/six/\n    Complete output (5 lines):\n    Traceback (most recent call last):\n      File "<string>", line 1, in <module>\n      File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/tokenize.py",> line 392, in open\n        buffer = _builtin_open(filename, \'rb\')\n    FileNotFoundError: [Errno 2] No such file or directory: \'/tmp/pip-download-fp6zf1cx/six/setup.py\'\n    ----------------------------------------\nERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.\n'
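
The underlying cause here is the pip download into /tmp/dataflow-requirements-cache tripping over a partially downloaded six sdist with no setup.py. One plausible local remedy, assuming the cache is simply stale (this is not a documented Beam procedure), is to clear the cache and re-run the same pip download command shown in the error:

    import shutil
    import subprocess
    import sys

    # Assumption: a partial sdist left in the Dataflow requirements cache broke
    # pip's egg_info step; wiping the cache and re-downloading may clear it.
    # sys.executable stands in for the build's gradleenv python seen in the log.
    cache_dir = '/tmp/dataflow-requirements-cache'
    shutil.rmtree(cache_dir, ignore_errors=True)
    subprocess.check_call([
        sys.executable, '-m', 'pip', 'download',
        '--dest', cache_dir,
        '-r', 'postcommit_requirements.txt',
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ])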
-------------------- >> begin captured logging << --------------------
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 241
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 241
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/datasets?prettyPrint=false HTTP/1.1" 200 None
google.cloud.pubsub_v1.publisher.client: DEBUG: Commit thread is waking up
root: DEBUG: Unhandled type_constraint: Union[]
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: gRPC Publish took 0.27463722229003906 seconds.
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authentication via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_16046452782436?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------
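
The repeated google.auth DEBUG lines in the captured logging are the library walking its default credential chain (explicit credentials, Cloud SDK, App Engine, then the GCE metadata server at metadata.google.internal). A minimal sketch that triggers the same discovery, assuming the google-auth package is installed and the process runs on a GCE worker or with application-default credentials configured:

    import google.auth

    # Walks the same chain logged above; on a GCE/Dataflow worker this ends at
    # the metadata server, matching the metadata.google.internal requests.
    credentials, project_id = google.auth.default(
        scopes=['https://www.googleapis.com/auth/cloud-platform'])
    print('resolved project:', project_id)
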

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 67 tests in 4496.151s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 34s
205 actionable tasks: 179 executed, 22 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/xgtjy2jz46mhy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org