Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/07/27 21:44:38 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #5523

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5523/display/redirect?page=changes>

Changes:

[noreply] 21730 fix offset resetting (#22450)

[noreply] Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)


------------------------------------------
[...truncated 58.58 MB...]
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>

src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
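
The TypeError above is gRPC's channel-argument validation rejecting a value it cannot encode: channel options are (key, value) pairs whose values must be int, bytes, or str, so a grpc_gcp_pb2.ApiConfig protobuf passed through as an option value fails as soon as the channel is constructed. A minimal sketch of the same failure mode, with purely illustrative names that are not part of the Beam test suite:

    import grpc

    # Option values that are int or str/bytes are accepted.
    channel = grpc.insecure_channel(
        "localhost:50051",
        options=[("grpc.keepalive_time_ms", 30000),
                 ("grpc.primary_user_agent", "illustration")])
    channel.close()

    class NotAChannelArg:
        """Stand-in for any object gRPC cannot encode, e.g. a protobuf message."""

    try:
        grpc.insecure_channel(
            "localhost:50051",
            options=[("grpc.illustrative_option", NotAChannelArg())])
    except TypeError as err:
        # Comparable to the failure above:
        # "Expected int, bytes, or behavior, got <class '...NotAChannelArg'>"
        print(err)

In this build the offending value is assembled inside the Spanner client stack (spanner_grpc_transport -> api_core grpc_helpers -> grpc.secure_channel) rather than by the test itself; the "Exclude grpcio==1.48.0 (#22539)" change that appears later in this thread suggests the regression came in with that grpcio release.
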
________________ SpannerWriteIntegrationTest.test_write_batches ________________
[gw1] linux -- Python 3.7.12 <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>

self = <apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest testMethod=test_write_batches>

    @pytest.mark.spannerio_it
    def test_write_batches(self):
      _prefex = 'test_write_batches'
      mutations = [
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '1', _prefex + 'inset-1')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '2', _prefex + 'inset-2')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '3', _prefex + 'inset-3')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '4', _prefex + 'inset-4')])
      ]
    
      p = beam.Pipeline(argv=self.args)
      _ = (
          p | beam.Create(mutations) | WriteToSpanner(
              project_id=self.project,
              instance_id=self.instance,
              database_id=self.TEST_DATABASE,
              max_batch_size_bytes=250))
    
      res = p.run()
      res.wait_until_finish()
>     self.assertEqual(self._count_data(_prefex), len(mutations))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:139: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:88: in _count_data
    with database.snapshot() as snapshot:
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:649: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>

src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:754 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp_0kuke33/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617
INFO     root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f91891ba200> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f91891ba9e0> ====================
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:573 Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/seaborn-0.11.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow-worker.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow-worker.jar in 5 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
                                                                           clientRequestId: '20220727213704122486-3036'
                                                                           createTime: '2022-07-27T21:37:12.251333Z'
                                                                           currentStateTime: '1970-01-01T00:00:00Z'
                                                                           id: '2022-07-27_14_37_11-1466142974706318720'
                                                                           location: 'us-central1'
                                                                           name: 'beamapp-jenkins-0727213704-121308-4kdurm23'
                                                                           projectId: 'apache-beam-testing'
                                                                           stageStates: []
                                                                           startTime: '2022-07-27T21:37:12.251333Z'
                                                                           steps: []
                                                                           tempFiles: []
                                                                           type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-07-27_14_37_11-1466142974706318720]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-07-27_14_37_11-1466142974706318720
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_14_37_11-1466142974706318720?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: 
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_14_37_11-1466142974706318720?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_14_37_11-1466142974706318720 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:12.877Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-07-27_14_37_11-1466142974706318720. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:13.333Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-27_14_37_11-1466142974706318720.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:15.750Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.056Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.100Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.136Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.171Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.216Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.244Z: JOB_MESSAGE_DETAILED: Unzipping flatten s5 for input s3.unbatchable
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.274Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToSpanner/Writing to spanner, through flatten WriteToSpanner/make batches/Merging batchable and unbatchable, into producer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.306Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/Writing to spanner into WriteToSpanner/make batches/ParDo(_BatchFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.339Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Making mutation groups into Create/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.373Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn) into WriteToSpanner/make batches/Making mutation groups
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.445Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/ParDo(_BatchFn) into WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.490Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.515Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.549Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.584Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.718Z: JOB_MESSAGE_DEBUG: Executing wait step start6
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.782Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.830Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.880Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:47.724Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:54.085Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:38:21.627Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.153Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.275Z: JOB_MESSAGE_DEBUG: Executing success step success4
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.393Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.441Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.496Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:44:22.059Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:44:22.111Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:44:22.147Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_14_37_11-1466142974706318720 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
==== 2 failed, 1 passed, 4 skipped, 2 warnings, 10 error in 925.57 seconds =====

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 73

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 165

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 24m 13s
227 actionable tasks: 161 executed, 60 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/zzwlzqbugd7aa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #5544

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5544/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python37 #5543

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5543/display/redirect?page=changes>

Changes:

[noreply] Exclude grpcio==1.48.0 (#22539)
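
That exclusion lines up with the channel-argument TypeError from the earlier builds in this thread. As a purely hypothetical illustration (PR #22539 contains the real change), a known-bad release is typically skipped with a "!=" specifier while the surrounding version range stays open:

    # Hypothetical dependency fragment, not Beam's actual setup.py;
    # the placeholder bounds are illustrative only.
    install_requires = [
        "grpcio>=1.33.1,!=1.48.0,<2",
    ]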


------------------------------------------
[...truncated 55.16 MB...]
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/examples/dataframe/flight_delays.py:47
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))

apache_beam/io/gcp/bigquery_test.py:1846
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:169
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:566
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:681
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2914
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2915
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2928
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============ 80 passed, 11 skipped, 199 warnings in 7422.75 seconds ============

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1251.30 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 352

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:xlangSpannerIOIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 26m 57s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/trzk33yxey4iy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


beam_PostCommit_Python37 - Build # 5542 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5542 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5542/ to view the results.

beam_PostCommit_Python37 - Build # 5541 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5541 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5541/ to view the results.

beam_PostCommit_Python37 - Build # 5540 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5540 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5540/ to view the results.

beam_PostCommit_Python37 - Build # 5539 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5539 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5539/ to view the results.

beam_PostCommit_Python37 - Build # 5538 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5538 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5538/ to view the results.

beam_PostCommit_Python37 - Build # 5537 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5537 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5537/ to view the results.

beam_PostCommit_Python37 - Build # 5536 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5536 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5536/ to view the results.

beam_PostCommit_Python37 - Build # 5535 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5535 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5535/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python37 #5534

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5534/display/redirect>

Changes:


------------------------------------------
[...truncated 652.02 KB...]
INFO:root:Docker container is running. container_id = b'f38b198828cf4adcbb6fbd4ce23a5ec69ee6830ad96114e3fc54f18d0adb95df', worker_id = worker_0
INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 917838811
}
message: "semi_persistent_directory: /tmp"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110"
thread: "MainThread"

WARNING:root:severity: WARN
timestamp {
  seconds: 1659187759
  nanos: 922815322
}
message: "Discarding unparseable args: [\'--direct_runner_use_stacked_bundle\', \'--pipeline_type_check\']"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/options/pipeline_options.py:339"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 924576759
}
message: "Pipeline_options: {\'experiments\': [\'beam_fn_api\'], \'requirements_file\': \'/tmp/tmp5igar8wq/requirements.txt\', \'save_main_session\': True, \'sdk_location\': \'container\', \'job_endpoint\': \'embed\', \'environment_type\': \'DOCKER\', \'sdk_worker_parallelism\': \'1\', \'environment_cache_millis\': \'0\'}"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:128"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 928173065
}
message: "Creating state cache with size 0"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/statecache.py:172"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 928527355
}
message: "Creating insecure control channel for localhost:39505."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:181"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 932955503
}
message: "Control channel established."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:189"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 933459043
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:232"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 935981512
}
message: "Python sdk harness starting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:182"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 939393043
}
message: "Creating insecure state channel for localhost:42921."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:858"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 939646005
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:865"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187759
  nanos: 941092252
}
message: "Creating client data channel for localhost:39387"
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:772"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187760
  nanos: 8015871
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187760
  nanos: 8207798
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187760
  nanos: 8291721
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:805"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187760
  nanos: 8367300
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:877"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187760
  nanos: 8566617
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1659187760
  nanos: 8642673
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:184"
thread: "MainThread"

f38b198828cf4adcbb6fbd4ce23a5ec69ee6830ad96114e3fc54f18d0adb95df
INFO:apache_beam.runners.portability.local_job_service:Completed job in 9.315286636352539 seconds with state DONE.
INFO:root:Completed job in 9.315286636352539 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 182

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:mongodbioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 8m 59s
217 actionable tasks: 147 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/tkw2oxic3ef2u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


beam_PostCommit_Python37 - Build # 5533 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5533 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5533/ to view the results.

beam_PostCommit_Python37 - Build # 5532 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5532 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5532/ to view the results.

beam_PostCommit_Python37 - Build # 5531 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5531 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5531/ to view the results.

beam_PostCommit_Python37 - Build # 5530 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5530 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5530/ to view the results.

beam_PostCommit_Python37 - Build # 5529 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5529 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5529/ to view the results.

beam_PostCommit_Python37 - Build # 5528 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5528 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5528/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python37 #5527

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5527/display/redirect?page=changes>

Changes:

[chamikaramj] Remove unnecessary reference to use_runner_v2 experiment in x-lang

[bulat.safiullin] [Website] remove beam-summit 2022 container with all related files

[yixiaoshen] Fix typo in Datastore V1ReadIT test

[noreply] Add read/write PubSub integration example fhirio pipeline (#22306)

[noreply] Remove deprecated Session runner (#22505)

[noreply] Add Go test status to the PR template (#22508)


------------------------------------------
[...truncated 58.33 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
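
The BeamDeprecationWarning above says not to read options back off the pipeline object. A minimal sketch of the pattern it points toward, with a made-up bucket name, is to keep the PipelineOptions instance used to construct the pipeline and query GoogleCloudOptions from it directly:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Hold on to the options object instead of reaching back through
    # <pipeline>.options, which the warning flags as deprecated.
    options = PipelineOptions(temp_location='gs://example-bucket/temp')  # hypothetical bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([temp_location]) | beam.Map(print)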

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
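
The PendingDeprecationWarning above suggests building the table reference without Client.dataset. A minimal sketch using placeholder project, dataset and table IDs:

    from google.cloud import bigquery

    # Construct the reference directly rather than via client.dataset(...),
    # which the warning marks for removal.
    table_ref = bigquery.TableReference.from_string('my-project.my_dataset.my_table')

    # Equivalent form using an explicit DatasetReference:
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')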

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))

apache_beam/examples/dataframe/flight_delays.py:47
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
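
The FutureWarning above asks for an explicit column selection before the reduction. A minimal sketch on a toy DataFrame (column names invented for illustration):

    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'UA'], 'arrival_delay': [12.0, 3.5]})
    # Restrict the reduction to numeric columns so the nuisance-column code path
    # described by the warning is never taken.
    means = df.select_dtypes(include='number').mean()
    # pandas also accepts an explicit flag: df.mean(numeric_only=True)
    print(means)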

apache_beam/io/gcp/bigquery_read_it_test.py:169
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
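
The BeamDeprecationWarning above names ReadFromBigQuery as the replacement. A minimal sketch with a placeholder query; running it for real needs GCP credentials plus a project and temp_location in the pipeline options:

    import apache_beam as beam

    query = 'SELECT 1 AS x'  # placeholder query

    with beam.Pipeline() as p:
        _ = (
            p
            # ReadFromBigQuery supersedes the deprecated beam.io.BigQuerySource.
            | beam.io.ReadFromBigQuery(query=query, use_standard_sql=True)
            | beam.Map(print))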

apache_beam/io/gcp/bigquery_test.py:1846
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
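
Likewise, the BeamDeprecationWarning above points to WriteToBigQuery as the replacement for BigQuerySink. A minimal sketch with a placeholder table spec and schema, again assuming GCP credentials and pipeline options are configured:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])  # toy rows
            | beam.io.WriteToBigQuery(
                'my-project:my_dataset.my_table',  # placeholder table spec
                schema='name:STRING,count:INTEGER',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))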

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:566
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:681
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2914
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2915
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2928
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 79 passed, 11 skipped, 199 warnings in 7420.18 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw4] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw4] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1409.27 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 29m 6s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/jfapetyqcy7oi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #5526

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5526/display/redirect>

Changes:


------------------------------------------
[...truncated 58.56 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))

apache_beam/examples/dataframe/flight_delays.py:47
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py:169
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_test.py:1846
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:566
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:681
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2914
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2915
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2928
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 77 passed, 11 skipped, 199 warnings in 7543.77 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1366.18 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 30m 28s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/hz6kbz3rwiwim

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #5525

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5525/display/redirect>

Changes:


------------------------------------------
[...truncated 58.36 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/examples/dataframe/flight_delays.py:47
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))

apache_beam/io/gcp/bigquery_read_it_test.py:169
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_test.py:1846
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:566
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:681
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2914
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2915
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2928
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 79 passed, 11 skipped, 199 warnings in 7161.88 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1419.64 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 25m 7s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/iefh4fvnrllxw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #5524

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5524/display/redirect?page=changes>

Changes:

[samuelw] Fixes #22438. Ensure that WindmillStateReader completes all batched read

[noreply] Upgrades pip before installing Beam for Python default expansion service

[noreply] [Go SDK]: Plumb allowed lateness to execution (#22476)

[Valentyn Tymofieiev] Restrict google-api-core

[Valentyn Tymofieiev] Regenerate the container dependencies.

[noreply] Replace distutils with supported modules. (#22456)

[noreply] [22369] Default Metrics for Executable Stages in Samza Runner (#22370)

[Kiley Sok] Moving to 2.42.0-SNAPSHOT on master branch.

[noreply] Remove stripping of step name. Replace removing only suffix step name


------------------------------------------
[...truncated 58.40 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129 (repeated 17 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131 (repeated 17 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
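
The two BeamDeprecationWarning entries above come from reads of <pipeline>.options inside Beam's BigQuery file-loads code. For user pipelines, the non-deprecated pattern is to build PipelineOptions up front and pass them to the Pipeline instead of reading them back later; a minimal sketch (the temp_location value is a hypothetical placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Configure options once, then hand them to the pipeline,
    # rather than dereferencing p.options afterwards.
    options = PipelineOptions()
    options.view_as(GoogleCloudOptions).temp_location = 'gs://my-temp-bucket/tmp'
    p = beam.Pipeline(options=options)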

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
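
The PendingDeprecationWarning above comes from the test utilities' use of Client.dataset(); the message suggests a fully qualified string or a DatasetReference instead. A minimal sketch of both forms, with hypothetical project/dataset/table names:

    from google.cloud import bigquery

    # Build the reference explicitly rather than via client.dataset(...):
    table_ref = bigquery.DatasetReference('my-project', 'my_dataset').table('my_table')

    # Or parse a fully qualified string into a reference:
    table_ref2 = bigquery.TableReference.from_string('my-project.my_dataset.my_table')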

apache_beam/dataframe/io.py:632 (repeated 3 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:581 (repeated 3 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2783 (repeated 9 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2811 (repeated 9 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))

apache_beam/examples/dataframe/flight_delays.py:47
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
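
The pandas FutureWarning above concerns reductions silently dropping non-numeric ("nuisance") columns; the fix pandas recommends is to select numeric columns explicitly before reducing. A minimal plain-pandas sketch (flight_delays.py uses the Beam DataFrame API, so the exact change there may differ; df and mask are hypothetical stand-ins for airline_df and at_top_airports):

    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'UA', 'AA'], 'delay': [12.0, 3.5, 7.0]})
    mask = df['airline'] == 'AA'

    # Select numeric columns first so mean() never has to drop nuisance columns.
    means = df[mask].select_dtypes(include='number').mean()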

apache_beam/io/gcp/bigquery_read_it_test.py:169
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
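
The BeamDeprecationWarning above recommends ReadFromBigQuery over the old BigQuerySource. A minimal sketch of the replacement (the query and gcs_location are hypothetical placeholders; actually running it requires GCP credentials and project settings):

    import apache_beam as beam

    # Construct (but do not run) a pipeline that reads query results from BigQuery.
    p = beam.Pipeline()
    rows = (
        p
        | 'Read' >> beam.io.ReadFromBigQuery(
            query='SELECT 1 AS one',
            use_standard_sql=True,
            gcs_location='gs://my-temp-bucket/bq-tmp'))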

apache_beam/io/gcp/bigquery_test.py:1846
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
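
Similarly, the warning above points at the deprecated BigQuerySink; WriteToBigQuery is the suggested replacement. A minimal sketch with a hypothetical table spec and schema (not runnable without real GCP settings):

    import apache_beam as beam

    p = beam.Pipeline()
    _ = (
        p
        | beam.Create([{'user_id': 'u1', 'key': 'k1'}])
        | beam.io.WriteToBigQuery(
            'my-project:my_dataset.my_table',
            schema='user_id:STRING,key:STRING',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))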

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:566
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:681
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())
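
The FutureWarning above flags the experimental ReadAllFromBigQuery transform, which reads from request objects flowing through the pipeline rather than from a single fixed source. A minimal sketch under that assumption (the query is a hypothetical placeholder):

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import ReadFromBigQueryRequest

    # Read requests are ordinary pipeline elements handed to ReadAllFromBigQuery.
    p = beam.Pipeline()
    results = (
        p
        | beam.Create([ReadFromBigQueryRequest(query='SELECT 1 AS one')])
        | beam.io.ReadAllFromBigQuery())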

apache_beam/io/gcp/bigquery.py:2914
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2915
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2928
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 79 passed, 11 skipped, 199 warnings in 7172.15 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1203.41 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 24m 49s
227 actionable tasks: 185 executed, 36 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/wcm32gnemd5xk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org