Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/03/10 00:11:44 UTC

Build failed in Jenkins: beam_PostCommit_Python39 #104

See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/104/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16844 from [BEAM-12164]: allow for nanosecond

[noreply] [BEAM-13904] Increase unit testing in the reflectx package (#17024)


------------------------------------------
[...truncated 4.98 MB...]
apache_beam/io/gcp/bigquery.py:2134
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(
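
As a side note on the warning above: the deprecated pattern is reading options back off an already constructed pipeline via <pipeline>.options. A minimal sketch of the supported pattern, with a hypothetical project and bucket, builds the PipelineOptions first, queries views on that object, and passes it to the Pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, StandardOptions)

    # Hypothetical project/bucket names, used purely for illustration.
    options = PipelineOptions(
        project='example-project', temp_location='gs://example-bucket/tmp')

    # Read option views from the options object, not from <pipeline>.options.
    is_streaming = options.view_as(StandardOptions).streaming
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)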

apache_beam/io/gcp/bigquery.py:2138
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery.py:2144
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery_file_loads.py:1128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1130
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:81
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:81: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(
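
For reference, the replacement the warning points to looks roughly like the sketch below; the query and the GCS scratch location are placeholders, and ReadFromBigQuery needs a temp_location (or gcs_location) it can write export files to:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # 'gs://example-bucket/tmp' is a placeholder scratch location.
    options = PipelineOptions(temp_location='gs://example-bucket/tmp')

    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | 'Read' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x', use_standard_sql=True)
            | 'Print' >> beam.Map(print))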

apache_beam/io/gcp/bigquery.py:2481
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2481: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2502
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2502: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())
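
The experimental transform mentioned above consumes a PCollection of read requests rather than a single fixed query. A hedged sketch, assuming the class names as they appear in apache_beam.io.gcp.bigquery and a placeholder query:

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import (
        ReadAllFromBigQuery, ReadFromBigQueryRequest)

    with beam.Pipeline() as p:
        _ = (
            p
            # Requests can be built dynamically at pipeline runtime.
            | beam.Create([ReadFromBigQueryRequest(query='SELECT 1 AS x')])
            | ReadAllFromBigQuery()
            | beam.Map(print))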

apache_beam/io/gcp/bigquery.py:2611
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2611: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2612
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2612: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2625
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2625: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/google/cloud/datastore/_gapic.py>:41
  <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/1398941893/lib/python3.9/site-packages/google/cloud/datastore/_gapic.py>:41: PendingDeprecationWarning: The `channel` argument is deprecated; use `transport` instead.
    return datastore_client.DatastoreClient(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-direct-py39.xml> -
============= 32 passed, 1 skipped, 119 warnings in 140.42 seconds =============

> Task :sdks:python:test-suites:portable:py39:postCommitPy39IT
WARNING  root:subprocess_server.py:95 Waiting for grpc channel to be ready at localhost:57289.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:23 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Staging artifacts for job_97b507c7-0151-4170-84e7-a84236f78d1e.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:23 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Resolving artifacts for job_97b507c7-0151-4170-84e7-a84236f78d1e.ref_Environment_default_environment_1.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:23 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Getting 0 artifacts for job_97b507c7-0151-4170-84e7-a84236f78d1e.null.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:23 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 finishStaging'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Artifacts fully staged for job_97b507c7-0151-4170-84e7-a84236f78d1e.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:24 AM org.apache.beam.runners.flink.FlinkJobInvoker invokeWithExecutor'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Invoking job BeamApp-jenkins-0310000823-b8733af5_7f087979-2c02-4fc0-9477-44a6800c5a81 with pipeline runner org.apache.beam.runners.flink.FlinkPipelineRunner@2c70d197'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:24 AM org.apache.beam.runners.jobsubmission.JobInvocation start'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Starting job invocation BeamApp-jenkins-0310000823-b8733af5_7f087979-2c02-4fc0-9477-44a6800c5a81'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:451 Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
                                                                                  with Pipeline() as p:
                                                                                    p.apply(..)
                                                                                This ensures that the pipeline finishes before this program exits.
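
The message above refers to the context-manager form of running a pipeline. In Python that looks roughly like the following sketch, where the job endpoint is a hypothetical local Flink job server and exiting the with block waits for the job to finish before the LOOPBACK worker in this process goes away:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='PortableRunner',
        job_endpoint='localhost:8099',   # hypothetical job server address
        environment_type='LOOPBACK')

    # Exiting the with block blocks until the pipeline completes.
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b', 'c']) | beam.Map(print)
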
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:574 Job state changed to STOPPED
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:24 AM org.apache.beam.runners.flink.FlinkPipelineRunner runPipelineWithTranslator'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Translating pipeline to Flink program.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:574 Job state changed to STARTING
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:574 Job state changed to RUNNING
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:24 AM org.apache.beam.runners.flink.FlinkExecutionEnvironments createBatchExecutionEnvironment'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Creating a Batch Execution Environment.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:24 AM software.amazon.awssdk.regions.internal.util.EC2MetadataUtils getItems'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'WARNING: Unable to retrieve the requested metadata.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Mar 10, 2022 12:08:24 AM org.apache.beam.sdk.io.aws2.s3.DefaultS3ClientBuilderFactory createBuilder'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"INFO: The AWS S3 Beam extension was included in this build, but the awsRegion flag was not specified. If you don't plan to use S3, then ignore this message."

> Task :sdks:python:container:py39:docker
.......................................................................................................................done.

Update done!


To help improve the quality of this product, we collect anonymized usage data
and anonymized stacktraces when crashes are encountered; additional information
is available at <https://cloud.google.com/sdk/usage-statistics>. This data is
handled in accordance with our privacy policy
<https://cloud.google.com/terms/cloud-privacy-notice>. You may choose to opt in to this
collection now (by choosing 'Y' at the below prompt), or at any time in the
future by running the following command:

    gcloud config set disable_usage_reporting false


This will install all the core command line tools necessary for working with
the Google Cloud Platform.

==> Source [/usr/local/gcloud/google-cloud-sdk/completion.bash.inc] in your profile to enable shell command completion for gcloud.
==> Source [/usr/local/gcloud/google-cloud-sdk/path.bash.inc] in your profile to add the Google Cloud SDK command line tools to your $PATH.

For more information on how to get started, please visit:
  https://cloud.google.com/sdk/docs/quickstarts


Removing intermediate container 72914d084b5f
 ---> b6bd31df4df3
Step 10/31 : RUN ln -s /usr/bin/ccache /usr/local/bin/gcc
 ---> Running in 766a9d3af01e
Removing intermediate container 766a9d3af01e
 ---> 778c89381a05
FAILURE: Build failed with an exception.

* What went wrong:
GC overhead limit exceeded

* Try:
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Exception is:
java.lang.OutOfMemoryError: GC overhead limit exceeded


* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python39 #105

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/105/display/redirect?page=changes>

