Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/10/20 15:57:12 UTC

Build failed in Jenkins: beam_PerformanceTests_BiqQueryIO_Write_Python_Batch #141

See <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/141/display/redirect?page=changes>

Changes:

[ajamato] Rename ProcessBundleProgressMetadataRequest to MonitoringInfosRequest.

[Robin Qiu] Ensure beam-sdks-java-bom.pom is signed during release

[noreply] Update go protocol buffers to v2. (#13115)

[noreply] [BEAM-11074] Release guide housekeeping. (#13147)


------------------------------------------
[...truncated 76.26 KB...]
 reason: 'invalid'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/batch****.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/executor.py", line 179, in execute
    op.start()
  File "dataflow_****/native_operations.py", line 38, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 39, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 44, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 54, in dataflow_****.native_operations.NativeReadOperation.start
  File "apache_beam/runners/****/operations.py", line 358, in apache_beam.runners.****.operations.Operation.output
  File "apache_beam/runners/****/operations.py", line 157, in apache_beam.runners.****.operations.ConsumerSet.receive
  File "apache_beam/runners/****/operations.py", line 717, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/****/operations.py", line 718, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'> [while running 'Write to BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']
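
The RuntimeError above originates in wait_for_bq_job (bigquery_tools.py, line 508): bigquery_file_loads.py calls it with max_retries=0, so the first errorResult that BigQuery reports on the load job is raised immediately rather than retried. A minimal sketch of that polling pattern, assuming a hypothetical get_job_status wrapper around the BigQuery jobs.get API (illustrative only, not Beam's actual implementation):

    import time

    def wait_for_bq_job(get_job_status, job_ref, sleep_duration_sec=10):
        # Sketch of the max_retries=0 path seen in this log: poll until
        # the job reaches DONE, then raise if BigQuery attached an
        # errorResult (here, 'Destination deleted/expired during operation').
        while True:
            job = get_job_status(job_ref)
            if job.status.state == 'DONE':
                if job.status.errorResult:
                    raise RuntimeError(
                        'BigQuery job %s failed. Error Result: %s'
                        % (job_ref.jobId, job.status.errorResult))
                return job
            time.sleep(sleep_duration_sec)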

INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:03.427Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/batch****.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/executor.py", line 179, in execute
    op.start()
  File "dataflow_****/native_operations.py", line 38, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 39, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 44, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 54, in dataflow_****.native_operations.NativeReadOperation.start
  File "apache_beam/runners/****/operations.py", line 358, in apache_beam.runners.****.operations.Operation.output
  File "apache_beam/runners/****/operations.py", line 157, in apache_beam.runners.****.operations.ConsumerSet.receive
  File "apache_beam/runners/****/operations.py", line 717, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/****/operations.py", line 718, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'> [while running 'Write to BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:04.702Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/batch****.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/executor.py", line 179, in execute
    op.start()
  File "dataflow_****/native_operations.py", line 38, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 39, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 44, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 54, in dataflow_****.native_operations.NativeReadOperation.start
  File "apache_beam/runners/****/operations.py", line 358, in apache_beam.runners.****.operations.Operation.output
  File "apache_beam/runners/****/operations.py", line 157, in apache_beam.runners.****.operations.ConsumerSet.receive
  File "apache_beam/runners/****/operations.py", line 717, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/****/operations.py", line 718, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'> [while running 'Write to BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:05.131Z: JOB_MESSAGE_BASIC: Finished operation Write to BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+Write to BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+Write to BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:05.202Z: JOB_MESSAGE_DEBUG: Executing failure step failure45
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:05.224Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S16:Write to BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+Write to BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+Write to BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  performance-tests-bqio-wr-10200844-0m89-harness-w261
      Root cause: Work item failed.,
  performance-tests-bqio-wr-10200844-0m89-harness-0lg6
      Root cause: Work item failed.,
  performance-tests-bqio-wr-10200844-0m89-harness-w261
      Root cause: Work item failed.,
  performance-tests-bqio-wr-10200844-0m89-harness-0lg6
      Root cause: Work item failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:05.297Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:05.457Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:05.495Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:53.833Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 5 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:53.886Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-20T15:56:53.923Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-10-20_08_44_37-11399434247216411574 is in state JOB_STATE_FAILED
INFO:apache_beam.io.gcp.tests.utils:Clean up a BigQuery table with project: apache-beam-testing, dataset: beam_performance, table: bqio_write_10GB.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1650, in wait_until_finish
    self)
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/batch****.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_****/executor.py", line 179, in execute
    op.start()
  File "dataflow_****/native_operations.py", line 38, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 39, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 44, in dataflow_****.native_operations.NativeReadOperation.start
  File "dataflow_****/native_operations.py", line 54, in dataflow_****.native_operations.NativeReadOperation.start
  File "apache_beam/runners/****/operations.py", line 358, in apache_beam.runners.****.operations.Operation.output
  File "apache_beam/runners/****/operations.py", line 157, in apache_beam.runners.****.operations.ConsumerSet.receive
  File "apache_beam/runners/****/operations.py", line 717, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/****/operations.py", line 718, in apache_beam.runners.****.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1215, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1294, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1213, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 742, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 867, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 604, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 508, in wait_for_bq_job
    job_reference.jobId, job.status.errorResult))
RuntimeError: BigQuery job beam_bq_job_LOAD_performancetestsbqiowritepythonbatch10gb1020150249_LOAD_STEP_657_77537339d62d56f83e6ee0bf5f6ca532_e45db797707740a29d95b1257a4f1f8b failed. Error Result: <ErrorProto
 message: 'Destination deleted/expired during operation: apache-beam-testing:beam_performance.bqio_write_10GB'
 reason: 'invalid'> [while running 'Write to BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py",> line 104, in delete_bq_table
    client.delete_table(table_ref)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/bigquery/client.py",> line 1512, in delete_table
    timeout=timeout,
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/bigquery/client.py",> line 641, in _call_api
    return call()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/api_core/retry.py",> line 286, in retry_wrapped_func
    on_error=on_error,
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/api_core/retry.py",> line 184, in retry_target
    return target()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/_http.py",> line 435, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.NotFound: 404 DELETE https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance/tables/bqio_write_10GB?prettyPrint=false: Not found: Table apache-beam-testing:beam_performance.bqio_write_10GB

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_perf_test.py",> line 109, in <module>
    BigQueryWritePerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 155, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_perf_test.py",> line 104, in cleanup
    self.project_id, self.output_dataset, self.output_table)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/utils/retry.py",> line 260, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py",> line 106, in delete_bq_table
    raise GcpTestIOError('BigQuery table does not exist: %s' % table_ref)
apache_beam.io.gcp.tests.utils.GcpTestIOError: BigQuery table does not exist: TableReference(DatasetReference('apache-beam-testing', 'beam_performance'), 'bqio_write_10GB')
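
The closing GcpTestIOError is secondary fallout: the destination table was deleted or expired while the load jobs were still running (the root-cause ErrorProto above), so by the time the test's cleanup ran, delete_bq_table received a 404 from BigQuery and converted it to GcpTestIOError. A teardown that tolerates an already-missing table is possible with the google-cloud-bigquery client's not_found_ok flag (available in recent client versions; delete_table_if_exists below is an illustrative helper, not part of the Beam test utils):

    from google.cloud import bigquery

    def delete_table_if_exists(project, dataset, table):
        # not_found_ok=True turns the 404 on DELETE into a no-op, so a
        # table that vanished mid-run does not fail the teardown.
        client = bigquery.Client(project=project)
        client.delete_table('%s.%s.%s' % (project, dataset, table),
                            not_found_ok=True)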

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 13m 51s
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/v3ca3lrh6q2li

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org