Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/07/26 21:44:29 UTC
Build failed in Jenkins: beam_PostCommit_Python37 #5519
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5519/display/redirect?page=changes>
Changes:
[noreply] Replace distutils with supported modules. (#21968)
[noreply] Revert "Replace distutils with supported modules. " (#22453)
------------------------------------------
[...truncated 58.44 MB...]
>   with database.batch() as batch:
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:153:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:611: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
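The TypeError above is raised by gRPC's Cython layer, which only accepts int, bytes, or str values (plus a few special "behavior" objects) for channel arguments; the grpc-gcp `ApiConfig` protobuf being passed as a channel option is rejected. A minimal sketch of that validation, with hypothetical helper and class names (this is illustrative, not Beam or gRPC source):

```python
# Sketch (assumption): gRPC channel options are (key, value) pairs whose
# values must be int, bytes, or str. An arbitrary object, such as the
# grpc_gcp_pb2.ApiConfig message in this traceback, fails the check.

def validate_channel_args(options):
    """Mimics the type check in grpc._cython.cygrpc._ChannelArgs (illustrative)."""
    for key, value in options:
        if not isinstance(value, (int, bytes, str)):
            raise TypeError(
                "Expected int, bytes, or behavior, got %r" % type(value))
    return True

class FakeApiConfig:
    """Hypothetical stand-in for the grpc_gcp_pb2.ApiConfig protobuf."""

# Plain scalar values pass:
validate_channel_args([("grpc.keepalive_time_ms", 30000)])

# An arbitrary object does not:
try:
    validate_channel_args([("grpc.grpc_gcp.api_config", FakeApiConfig())])
except TypeError:
    pass  # reproduces the error class seen in the build log
```

This is consistent with the fix that later landed (excluding the grpcio release whose validation rejected the grpc-gcp config object), though the exact mechanics inside grpcio are an assumption here.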
________________ SpannerWriteIntegrationTest.test_write_batches ________________
[gw1] linux -- Python 3.7.12 <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest testMethod=test_write_batches>
    @pytest.mark.spannerio_it
    def test_write_batches(self):
      _prefex = 'test_write_batches'
      mutations = [
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '1', _prefex + 'inset-1')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '2', _prefex + 'inset-2')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '3', _prefex + 'inset-3')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '4', _prefex + 'inset-4')])
      ]

      p = beam.Pipeline(argv=self.args)
      _ = (
          p | beam.Create(mutations) | WriteToSpanner(
              project_id=self.project,
              instance_id=self.instance,
              database_id=self.TEST_DATABASE,
              max_batch_size_bytes=250))

      res = p.run()
      res.wait_until_finish()
>     self.assertEqual(self._count_data(_prefex), len(mutations))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:88: in _count_data
    with database.snapshot() as snapshot:
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:649: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
------------------------------ Captured log call -------------------------------
INFO  apache_beam.runners.portability.stager:stager.py:754 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpd2wugkvk/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
INFO  apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO  root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO  root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617
INFO  root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617" for Docker environment
INFO  apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f900a6dd200> ====================
INFO  apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f900a6dd9e0> ====================
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:573 Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/requirements.txt...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/requirements.txt in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/mock-2.0.0-py2.py3-none-any.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/seaborn-0.11.2-py3-none-any.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/PyHamcrest-1.10.1-py3-none-any.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/parameterized-0.7.5-py2.py3-none-any.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/dataflow_python_sdk.tar...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/dataflow_python_sdk.tar in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/dataflow-worker.jar...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/dataflow-worker.jar in 5 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/pipeline.pb...
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0726213617-531271-4kdurm23.1658871377.531459/pipeline.pb in 0 seconds.
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20220726213617532410-3036'
createTime: '2022-07-26T21:36:27.005457Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-07-26_14_36_25-13195888978815320332'
location: 'us-central1'
name: 'beamapp-jenkins-0726213617-531271-4kdurm23'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-07-26T21:36:27.005457Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-07-26_14_36_25-13195888978815320332]
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-07-26_14_36_25-13195888978815320332
INFO  apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-26_14_36_25-13195888978815320332?project=apache-beam-testing
INFO  apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO  apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-26_14_36_25-13195888978815320332?project=apache-beam-testing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-26_14_36_25-13195888978815320332 is in state JOB_STATE_RUNNING
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:27.470Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-07-26_14_36_25-13195888978815320332. The number of workers will be between 1 and 1000.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:30.753Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-26_14_36_25-13195888978815320332.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:32.657Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.433Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.473Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.502Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.525Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.566Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.591Z: JOB_MESSAGE_DETAILED: Unzipping flatten s5 for input s3.unbatchable
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.615Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToSpanner/Writing to spanner, through flatten WriteToSpanner/make batches/Merging batchable and unbatchable, into producer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.643Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/Writing to spanner into WriteToSpanner/make batches/ParDo(_BatchFn)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.676Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Making mutation groups into Create/Read
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.710Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn) into WriteToSpanner/make batches/Making mutation groups
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.734Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/ParDo(_BatchFn) into WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.759Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.786Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.809Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.836Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:34.975Z: JOB_MESSAGE_DEBUG: Executing wait step start6
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:35.080Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:35.147Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:35.178Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:36:48.120Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:37:08.554Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:37:37.977Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:17.713Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:17.780Z: JOB_MESSAGE_DEBUG: Executing success step success4
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:17.855Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:17.902Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:17.926Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:59.758Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:59.806Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-26T21:43:59.837Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-26_14_36_25-13195888978815320332 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
==== 2 failed, 1 passed, 4 skipped, 2 warnings, 10 error in 978.79 seconds =====
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 73
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 165
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 24m 4s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/xqmgh42yoi4ry
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Python37 #5544
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5544/display/redirect?page=changes>
Build failed in Jenkins: beam_PostCommit_Python37 #5543
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5543/display/redirect?page=changes>
Changes:
[noreply] Exclude grpcio==1.48.0 (#22539)
------------------------------------------
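The single change in this build excludes grpcio 1.48.0, the release whose channel-argument handling produced the ApiConfig TypeError in the earlier failures. A hedged sketch of how a `!=`-style exclusion behaves (illustrative model only, not the actual Beam setup.py change; `grpcio_allowed` and `BROKEN` are hypothetical names):

```python
# Illustrative only: a dependency range such as "grpcio>=1.29.0,!=1.48.0"
# accepts every version in range except the broken release.
BROKEN = "1.48.0"

def grpcio_allowed(version: str) -> bool:
    # Assumption: simple exact-version exclusion, as pip's "!=" specifier
    # does for non-wildcard versions.
    return version != BROKEN
```

Under this model, 1.47.x and 1.48.1 remain installable while the broken 1.48.0 is skipped, which matches the intent of the change title.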
[...truncated 55.16 MB...]
self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/tests/utils.py:100
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
apache_beam/io/fileio.py:581
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/examples/dataframe/flight_delays.py:47
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
apache_beam/io/gcp/bigquery.py:2783
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
apache_beam/io/gcp/bigquery.py:2811
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
| _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))
apache_beam/io/gcp/bigquery_test.py:1846
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/bigquery_read_it_test.py:169
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/ml/gcp/cloud_dlp_it_test.py:77
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
apache_beam/ml/gcp/cloud_dlp_it_test.py:87
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
| beam.ParDo(extract_inspection_results).with_outputs(
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
kms_key=kms_key))
apache_beam/runners/dataflow/ptransform_overrides.py:323
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=self.kms_key))
apache_beam/io/gcp/bigquery_read_it_test.py:566
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py:681
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
| beam.io.ReadAllFromBigQuery())
apache_beam/io/gcp/bigquery.py:2914
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery.py:2915
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2928
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============ 80 passed, 11 skipped, 199 warnings in 7422.75 seconds ============
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1251.30 seconds ==============
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 352
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:xlangSpannerIOIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
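To act on the suggestions above in one go, the failing task can be re-run with the diagnostic flags combined — a sketch only; the task path is taken from the failure message above:

```shell
# Re-run the failing task with stack trace, verbose logging, and full
# deprecation-warning detail.
./gradlew :sdks:python:test-suites:portable:py37:xlangSpannerIOIT \
    --stacktrace --info --warning-mode all
```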
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 26m 57s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/trzk33yxey4iy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
beam_PostCommit_Python37 - Build # 5542 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5542 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5542/ to view the results.
beam_PostCommit_Python37 - Build # 5541 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5541 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5541/ to view the results.
beam_PostCommit_Python37 - Build # 5540 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5540 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5540/ to view the results.
beam_PostCommit_Python37 - Build # 5539 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5539 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5539/ to view the results.
beam_PostCommit_Python37 - Build # 5538 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5538 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5538/ to view the results.
beam_PostCommit_Python37 - Build # 5537 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5537 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5537/ to view the results.
beam_PostCommit_Python37 - Build # 5536 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5536 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5536/ to view the results.
beam_PostCommit_Python37 - Build # 5535 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5535 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5535/ to view the results.
Build failed in Jenkins: beam_PostCommit_Python37 #5534
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5534/display/redirect>
Changes:
------------------------------------------
[...truncated 652.02 KB...]
INFO:root:Docker container is running. container_id = b'f38b198828cf4adcbb6fbd4ce23a5ec69ee6830ad96114e3fc54f18d0adb95df', worker_id = worker_0
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 917838811
}
message: "semi_persistent_directory: /tmp"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:110"
thread: "MainThread"
WARNING:root:severity: WARN
timestamp {
seconds: 1659187759
nanos: 922815322
}
message: "Discarding unparseable args: [\'--direct_runner_use_stacked_bundle\', \'--pipeline_type_check\']"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/options/pipeline_options.py:339"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 924576759
}
message: "Pipeline_options: {\'experiments\': [\'beam_fn_api\'], \'requirements_file\': \'/tmp/tmp5igar8wq/requirements.txt\', \'save_main_session\': True, \'sdk_location\': \'container\', \'job_endpoint\': \'embed\', \'environment_type\': \'DOCKER\', \'sdk_worker_parallelism\': \'1\', \'environment_cache_millis\': \'0\'}"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:128"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 928173065
}
message: "Creating state cache with size 0"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/statecache.py:172"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 928527355
}
message: "Creating insecure control channel for localhost:39505."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:181"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 932955503
}
message: "Control channel established."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:189"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 933459043
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:232"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 935981512
}
message: "Python sdk harness starting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:182"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 939393043
}
message: "Creating insecure state channel for localhost:42921."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:858"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 939646005
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:865"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1659187759
nanos: 941092252
}
message: "Creating client data channel for localhost:39387"
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:772"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1659187760
nanos: 8015871
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187760
nanos: 8207798
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187760
nanos: 8291721
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:805"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187760
nanos: 8367300
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:877"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187760
nanos: 8566617
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1659187760
nanos: 8642673
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:184"
thread: "MainThread"
f38b198828cf4adcbb6fbd4ce23a5ec69ee6830ad96114e3fc54f18d0adb95df
INFO:apache_beam.runners.portability.local_job_service:Completed job in 9.315286636352539 seconds with state DONE.
INFO:root:Completed job in 9.315286636352539 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
FAILURE: Build completed with 5 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 182
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
5: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 120
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:mongodbioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 8m 59s
217 actionable tasks: 147 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/tkw2oxic3ef2u
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
beam_PostCommit_Python37 - Build # 5533 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5533 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5533/ to view the results.
beam_PostCommit_Python37 - Build # 5532 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5532 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5532/ to view the results.
beam_PostCommit_Python37 - Build # 5531 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5531 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5531/ to view the results.
beam_PostCommit_Python37 - Build # 5530 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5530 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5530/ to view the results.
beam_PostCommit_Python37 - Build # 5529 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5529 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5529/ to view the results.
beam_PostCommit_Python37 - Build # 5528 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 5528 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/5528/ to view the results.
Build failed in Jenkins: beam_PostCommit_Python37 #5527
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5527/display/redirect?page=changes>
Changes:
[chamikaramj] Remove unnecessary reference to use_runner_v2 experiment in x-lang
[bulat.safiullin] [Website] remove beam-summit 2022 container with all related files
[yixiaoshen] Fix typo in Datastore V1ReadIT test
[noreply] Add read/write PubSub integration example fhirio pipeline (#22306)
[noreply] Remove deprecated Session runner (#22505)
[noreply] Add Go test status to the PR template (#22508)
------------------------------------------
[...truncated 58.33 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1131
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/tests/utils.py:100
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/dataframe/io.py:632
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/io/fileio.py:581
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/io/gcp/bigquery.py:2783
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2811
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
| _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))
apache_beam/examples/dataframe/flight_delays.py:47
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
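The FutureWarning above recommends selecting only valid columns before calling the reduction. A minimal sketch of that fix, using hypothetical stand-in data for the `airline_df` / `at_top_airports` names from flight_delays.py (not the actual example's dataset):

```python
# Sketch of the fix the pandas FutureWarning suggests: pass numeric_only=True
# (or pre-select numeric columns) instead of relying on the deprecated silent
# dropping of non-numeric "nuisance" columns during a reduction.
import pandas as pd

airline_df = pd.DataFrame({
    "airport": ["SEA", "SFO", "SJC"],          # non-numeric nuisance column
    "arrival_delay": [5.0, -2.0, 11.0],
    "departure_delay": [1.0, 0.0, 7.0],
})
at_top_airports = airline_df["airport"].isin(["SEA", "SFO"])

# Explicit numeric_only avoids the deprecated behavior and the warning.
means = airline_df[at_top_airports].mean(numeric_only=True)
print(means["arrival_delay"])  # 1.5
```

The same effect can be had with `airline_df[at_top_airports].select_dtypes("number").mean()` when the column set is not known up front.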
apache_beam/io/gcp/bigquery_read_it_test.py:169
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
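The BeamDeprecationWarning above points to ReadFromBigQuery as the replacement. A hedged sketch of the migration, with a placeholder query rather than the actual test's SQL; running it would require a real GCP project and temp_location, so the pipeline is constructed but not executed:

```python
# Hypothetical migration sketch for the BigQuerySource deprecation:
# the deprecated source wrapped in beam.io.Read is replaced by the
# supported ReadFromBigQuery transform.
import apache_beam as beam

query = "SELECT * FROM `my_project.my_dataset.my_table`"  # placeholder

p = beam.Pipeline()
# Deprecated since 2.25.0:
#   p | beam.io.Read(beam.io.BigQuerySource(query=query, use_standard_sql=True))
# Supported replacement:
rows = p | beam.io.ReadFromBigQuery(query=query, use_standard_sql=True)
# p.run() omitted: it needs GCP credentials and a temp_location.
```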
apache_beam/io/gcp/bigquery_test.py:1846
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
kms_key=kms_key))
apache_beam/runners/dataflow/ptransform_overrides.py:323
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=self.kms_key))
apache_beam/ml/gcp/cloud_dlp_it_test.py:77
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
apache_beam/ml/gcp/cloud_dlp_it_test.py:87
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
| beam.ParDo(extract_inspection_results).with_outputs(
apache_beam/io/gcp/bigquery_read_it_test.py:566
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py:681
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
| beam.io.ReadAllFromBigQuery())
apache_beam/io/gcp/bigquery.py:2914
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery.py:2915
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2928
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
options=pcoll.pipeline.options,
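The repeated "options is deprecated" warnings above all stem from reading options back through `<pipeline>.options`. A sketch of the pattern the warning points toward, with placeholder flag values; it assumes only the standard PipelineOptions API:

```python
# Sketch: build a PipelineOptions object up front, keep a reference to it,
# and read views from that object directly instead of reaching back through
# pipeline.options after construction. Flag values are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import (
    GoogleCloudOptions,
    PipelineOptions,
)

options = PipelineOptions(
    ["--project=my-project", "--temp_location=gs://my-bucket/tmp"])
gcloud_options = options.view_as(GoogleCloudOptions)

p = beam.Pipeline(options=options)
# Use gcloud_options.project / gcloud_options.temp_location here, rather
# than p.options.view_as(GoogleCloudOptions)....
```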
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 79 passed, 11 skipped, 199 warnings in 7420.18 seconds =======
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
[gw4] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw4] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1409.27 seconds ==============
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 29m 6s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/jfapetyqcy7oi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python37 #5526
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5526/display/redirect>
Changes:
------------------------------------------
[...truncated 58.56 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1131
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/tests/utils.py:100
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
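The PendingDeprecationWarning above recommends a string or a DatasetReference in place of `Client.dataset`. A sketch of the string form, using hypothetical placeholder identifiers rather than the test suite's real values:

```python
# Sketch of the suggested migration: instead of
#   table_ref = client.dataset(dataset_id).table(table_id)
# build a fully qualified "project.dataset.table" string, which the
# google-cloud-bigquery client methods (e.g. client.get_table) accept.
project_id = "my_project"    # placeholder
dataset_id = "my_dataset"    # placeholder
table_id = "my_table"        # placeholder

table_ref = f"{project_id}.{dataset_id}.{table_id}"
print(table_ref)  # my_project.my_dataset.my_table
```

A `bigquery.DatasetReference(project_id, dataset_id).table(table_id)` object works equally well when an object rather than a string is preferred.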
apache_beam/dataframe/io.py:632
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/io/fileio.py:581
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/io/gcp/bigquery.py:2783
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2811
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
| _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))
apache_beam/examples/dataframe/flight_delays.py:47
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery_read_it_test.py:169
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_test.py:1846
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
kms_key=kms_key))
apache_beam/runners/dataflow/ptransform_overrides.py:323
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=self.kms_key))
apache_beam/ml/gcp/cloud_dlp_it_test.py:77
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
apache_beam/ml/gcp/cloud_dlp_it_test.py:87
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
| beam.ParDo(extract_inspection_results).with_outputs(
apache_beam/io/gcp/bigquery_read_it_test.py:566
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py:681
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
| beam.io.ReadAllFromBigQuery())
apache_beam/io/gcp/bigquery.py:2914
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery.py:2915
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2928
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 77 passed, 11 skipped, 199 warnings in 7543.77 seconds =======
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1366.18 seconds ==============
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 30m 28s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/hz6kbz3rwiwim
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python37 #5525
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5525/display/redirect>
Changes:
------------------------------------------
[...truncated 58.36 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1131
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/tests/utils.py:100
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/dataframe/io.py:632
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/io/fileio.py:581
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/examples/dataframe/flight_delays.py:47
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery.py:2783
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2811
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
| _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))
apache_beam/io/gcp/bigquery_read_it_test.py:169
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_test.py:1846
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
kms_key=kms_key))
apache_beam/runners/dataflow/ptransform_overrides.py:323
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=self.kms_key))
apache_beam/ml/gcp/cloud_dlp_it_test.py:77
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
apache_beam/ml/gcp/cloud_dlp_it_test.py:87
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
| beam.ParDo(extract_inspection_results).with_outputs(
apache_beam/io/gcp/bigquery_read_it_test.py:566
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py:681
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
| beam.io.ReadAllFromBigQuery())
apache_beam/io/gcp/bigquery.py:2914
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery.py:2915
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2928
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 79 passed, 11 skipped, 199 warnings in 7161.88 seconds =======
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1419.64 seconds ==============
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 25m 7s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/iefh4fvnrllxw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python37 #5524
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5524/display/redirect?page=changes>
Changes:
[samuelw] Fixes #22438. Ensure that WindmillStateReader completes all batched read
[noreply] Upgrades pip before installing Beam for Python default expansion service
[noreply] [Go SDK]: Plumb allowed lateness to execution (#22476)
[Valentyn Tymofieiev] Restrict google-api-core
[Valentyn Tymofieiev] Regenerate the container dependencies.
[noreply] Replace distutils with supported modules. (#22456)
[noreply] [22369] Default Metrics for Executable Stages in Samza Runner (#22370)
[Kiley Sok] Moving to 2.42.0-SNAPSHOT on master branch.
[noreply] Remove stripping of step name. Replace removing only suffix step name
------------------------------------------
[...truncated 58.40 MB...]
apache_beam/io/gcp/bigquery_file_loads.py:1129
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1131
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/tests/utils.py:100
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/dataframe/io.py:632
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/io/fileio.py:581
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:581: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/io/gcp/bigquery.py:2783
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2783: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2811
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2811: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
| _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))
apache_beam/examples/dataframe/flight_delays.py:47
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
apache_beam/io/gcp/bigquery_read_it_test.py:169
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:169: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_test.py:1846
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1846: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
kms_key=kms_key))
apache_beam/runners/dataflow/ptransform_overrides.py:323
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=self.kms_key))
apache_beam/ml/gcp/cloud_dlp_it_test.py:77
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
inspection_config=INSPECT_CONFIG))
apache_beam/ml/gcp/cloud_dlp_it_test.py:87
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
| beam.ParDo(extract_inspection_results).with_outputs(
apache_beam/io/gcp/bigquery_read_it_test.py:566
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:566: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
apache_beam/io/gcp/bigquery_read_it_test.py:681
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:681: FutureWarning: ReadAllFromBigQuery is experimental.
| beam.io.ReadAllFromBigQuery())
apache_beam/io/gcp/bigquery.py:2914
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2914: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/io/gcp/bigquery.py:2915
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2915: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project
apache_beam/io/gcp/bigquery.py:2928
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2928: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 79 passed, 11 skipped, 199 warnings in 7172.15 seconds =======
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.42.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] [33mSKIPPED[0m apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] [33mSKIPPED[0m apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] [33mSKIPPED[0m apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] [32mPASSED[0m apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1203.41 seconds ==============
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 24m 49s
227 actionable tasks: 185 executed, 36 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/wcm32gnemd5xk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python37 #5523
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5523/display/redirect?page=changes>
Changes:
[noreply] 21730 fix offset resetting (#22450)
[noreply] Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)
------------------------------------------
[...truncated 58.58 MB...]
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
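Editor's note on the TypeError above: gRPC channel options are (name, value) pairs whose values Cython-level validation accepts only as int or bytes (or a behavior object); the Spanner transport in this environment passes a grpc_gcp_pb2.ApiConfig protobuf message straight through as an option value, which newer grpcio releases reject with exactly this message. The sketch below is illustrative only and does not use the real libraries; `validate_channel_arg` mimics the check in grpc._cython.cygrpc._ChannelArgs, and `FakeApiConfig` is a hypothetical stand-in for the real protobuf class.

```python
def validate_channel_arg(value):
    # Mimics the type check in grpc._cython.cygrpc._ChannelArgs: channel
    # option values may only be int or bytes (the real check also allows
    # a "behavior" pointer object, omitted here).
    if isinstance(value, (int, bytes)):
        return value
    raise TypeError("Expected int, bytes, or behavior, got %r" % type(value))


class FakeApiConfig:
    """Hypothetical stand-in for grpc_gcp_pb2.ApiConfig (illustration only)."""

    def SerializeToString(self):
        return b"\x08\x01"  # pretend wire-format bytes


# A plain int option value (e.g. a keepalive interval in ms) passes:
assert validate_channel_arg(30000) == 30000

# Passing the message object itself reproduces the failure mode in the log:
cfg = FakeApiConfig()
try:
    validate_channel_arg(cfg)
except TypeError as exc:
    print("rejected:", exc)

# Serializing the message to bytes first satisfies the check:
assert validate_channel_arg(cfg.SerializeToString()) == b"\x08\x01"
```

In other words, the failure is a version mismatch between the pinned google-cloud-spanner/grpcio-gcp transport and the installed grpcio, not a bug in the Beam tests themselves.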
________________ SpannerWriteIntegrationTest.test_write_batches ________________
[gw1] linux -- Python 3.7.12 <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest testMethod=test_write_batches>
    @pytest.mark.spannerio_it
    def test_write_batches(self):
      _prefex = 'test_write_batches'
      mutations = [
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '1', _prefex + 'inset-1')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '2', _prefex + 'inset-2')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '3', _prefex + 'inset-3')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '4', _prefex + 'inset-4')])
      ]

      p = beam.Pipeline(argv=self.args)
      _ = (
          p | beam.Create(mutations) | WriteToSpanner(
              project_id=self.project,
              instance_id=self.instance,
              database_id=self.TEST_DATABASE,
              max_batch_size_bytes=250))

      res = p.run()
      res.wait_until_finish()
>     self.assertEqual(self._count_data(_prefex), len(mutations))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:88: in _count_data
    with database.snapshot() as snapshot:
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:649: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:754 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp_0kuke33/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
INFO apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617
INFO root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f91891ba200> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f91891ba9e0> ====================
WARNING apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:573 Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/mock-2.0.0-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/seaborn-0.11.2-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/PyHamcrest-1.10.1-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/parameterized-0.7.5-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/dataflow-worker.jar in 5 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727213704-121308-4kdurm23.1658957824.121495/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20220727213704122486-3036'
createTime: '2022-07-27T21:37:12.251333Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-07-27_14_37_11-1466142974706318720'
location: 'us-central1'
name: 'beamapp-jenkins-0727213704-121308-4kdurm23'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-07-27T21:37:12.251333Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-07-27_14_37_11-1466142974706318720]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-07-27_14_37_11-1466142974706318720
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_14_37_11-1466142974706318720?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_14_37_11-1466142974706318720?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_14_37_11-1466142974706318720 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:12.877Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-07-27_14_37_11-1466142974706318720. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:13.333Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-27_14_37_11-1466142974706318720.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:15.750Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.056Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.100Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.136Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.171Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.216Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.244Z: JOB_MESSAGE_DETAILED: Unzipping flatten s5 for input s3.unbatchable
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.274Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToSpanner/Writing to spanner, through flatten WriteToSpanner/make batches/Merging batchable and unbatchable, into producer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.306Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/Writing to spanner into WriteToSpanner/make batches/ParDo(_BatchFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.339Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Making mutation groups into Create/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.373Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn) into WriteToSpanner/make batches/Making mutation groups
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.445Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/ParDo(_BatchFn) into WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.490Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.515Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.549Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.584Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.718Z: JOB_MESSAGE_DEBUG: Executing wait step start6
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.782Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.830Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:18.880Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:47.724Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:37:54.085Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:38:21.627Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.153Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.275Z: JOB_MESSAGE_DEBUG: Executing success step success4
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.393Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.441Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:43:45.496Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:44:22.059Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:44:22.111Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T21:44:22.147Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_14_37_11-1466142974706318720 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
==== 2 failed, 1 passed, 4 skipped, 2 warnings, 10 error in 925.57 seconds =====
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 73
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 121
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 165
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 24m 13s
227 actionable tasks: 161 executed, 60 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/zzwlzqbugd7aa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python37 #5522
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5522/display/redirect>
Changes:
------------------------------------------
[...truncated 58.43 MB...]
>   with database.batch() as batch:
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:153:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:611: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
________________ SpannerWriteIntegrationTest.test_write_batches ________________
[gw1] linux -- Python 3.7.12 <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest testMethod=test_write_batches>
    @pytest.mark.spannerio_it
    def test_write_batches(self):
      _prefex = 'test_write_batches'
      mutations = [
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '1', _prefex + 'inset-1')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '2', _prefex + 'inset-2')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '3', _prefex + 'inset-3')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '4', _prefex + 'inset-4')])
      ]

      p = beam.Pipeline(argv=self.args)
      _ = (
          p | beam.Create(mutations) | WriteToSpanner(
              project_id=self.project,
              instance_id=self.instance,
              database_id=self.TEST_DATABASE,
              max_batch_size_bytes=250))

      res = p.run()
      res.wait_until_finish()
>     self.assertEqual(self._count_data(_prefex), len(mutations))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:88: in _count_data
    with database.snapshot() as snapshot:
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:649: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:754 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpjj06e1zj/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617
INFO     root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f5391fdb200> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f5391fdb9e0> ====================
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:573 Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/seaborn-0.11.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/dataflow-worker.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/dataflow-worker.jar in 5 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727153856-754020-4kdurm23.1658936336.754209/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20220727153856755130-3036'
createTime: '2022-07-27T15:39:05.236630Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-07-27_08_39_04-1233593834374540911'
location: 'us-central1'
name: 'beamapp-jenkins-0727153856-754020-4kdurm23'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-07-27T15:39:05.236630Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-07-27_08_39_04-1233593834374540911]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-07-27_08_39_04-1233593834374540911
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_08_39_04-1233593834374540911?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_08_39_04-1233593834374540911?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_08_39_04-1233593834374540911 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:05.742Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-07-27_08_39_04-1233593834374540911. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:10.806Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-27_08_39_04-1233593834374540911.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:14.249Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.184Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.215Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.252Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.287Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.325Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.355Z: JOB_MESSAGE_DETAILED: Unzipping flatten s5 for input s3.unbatchable
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.449Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToSpanner/Writing to spanner, through flatten WriteToSpanner/make batches/Merging batchable and unbatchable, into producer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.472Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/Writing to spanner into WriteToSpanner/make batches/ParDo(_BatchFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.496Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Making mutation groups into Create/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.529Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn) into WriteToSpanner/make batches/Making mutation groups
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.582Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/ParDo(_BatchFn) into WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.616Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.647Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.680Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.714Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.873Z: JOB_MESSAGE_DEBUG: Executing wait step start6
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:16.942Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:17.048Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:17.076Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:48.994Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:39:53.414Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:40:23.519Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:09.911Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:09.957Z: JOB_MESSAGE_DEBUG: Executing success step success4
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:10.027Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:10.083Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:10.103Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:53.467Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:53.513Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T15:46:53.537Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_08_39_04-1233593834374540911 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml -
==== 2 failed, 1 passed, 4 skipped, 2 warnings, 10 error in 967.46 seconds =====
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle' line: 73
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 165
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 26m 44s
227 actionable tasks: 157 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/qqzli5tpfr6ca
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_PostCommit_Python37 #5521
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5521/display/redirect>
Changes:
------------------------------------------
[...truncated 58.41 MB...]
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py[0m:235: in spanner_api
[1m client_options=client_options,[0m
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py[0m:194: in __init__
[1m address=api_endpoint, channel=channel, credentials=credentials[0m
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py[0m:77: in __init__
[1m "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,[0m
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py[0m:114: in create_channel
[1m address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs[0m
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py[0m:297: in create_channel
[1m return grpc.secure_channel(target, composite_credentials, **kwargs)[0m
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py[0m:2005: in secure_channel
[1m credentials._credentials, compression)[0m
[1m[31m../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py[0m:1480: in __init__
[1m credentials)[0m
[1m[31msrc/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi[0m:454: in grpc._cython.cygrpc.Channel.__cinit__
[1m ???[0m
[1m[31msrc/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi[0m:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
[1m ???[0m
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[1m> ???[0m
[1m[31mE TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>[0m
[1m[31msrc/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi[0m:60: TypeError
[31m[1m________________ SpannerWriteIntegrationTest.test_write_batches ________________[0m
[gw1] linux -- Python 3.7.12 <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest testMethod=test_write_batches>
[1m @pytest.mark.spannerio_it[0m
[1m def test_write_batches(self):[0m
[1m _prefex = 'test_write_batches'[0m
[1m mutations = [[0m
[1m WriteMutation.insert([0m
[1m 'Users', ('UserId', 'Key'), [(_prefex + '1', _prefex + 'inset-1')]),[0m
[1m WriteMutation.insert([0m
[1m 'Users', ('UserId', 'Key'), [(_prefex + '2', _prefex + 'inset-2')]),[0m
[1m WriteMutation.insert([0m
[1m 'Users', ('UserId', 'Key'), [(_prefex + '3', _prefex + 'inset-3')]),[0m
[1m WriteMutation.insert([0m
[1m 'Users', ('UserId', 'Key'), [(_prefex + '4', _prefex + 'inset-4')])[0m
[1m ][0m
[1m [0m
[1m p = beam.Pipeline(argv=self.args)[0m
[1m _ = ([0m
[1m p | beam.Create(mutations) | WriteToSpanner([0m
[1m project_id=self.project,[0m
[1m instance_id=self.instance,[0m
[1m database_id=self.TEST_DATABASE,[0m
[1m max_batch_size_bytes=250))[0m
[1m [0m
[1m res = p.run()[0m
[1m res.wait_until_finish()[0m
[1m> self.assertEqual(self._count_data(_prefex), len(mutations))[0m
[1m[31mapache_beam/io/gcp/experimental/spannerio_write_it_test.py[0m:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[1m[31mapache_beam/io/gcp/experimental/spannerio_write_it_test.py[0m:88: in _count_data
[1m with database.snapshot() as snapshot:[0m
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:649: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
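The TypeError above comes from gRPC's channel-argument validation: each channel option value must be an int, bytes, or str, and here a protobuf `ApiConfig` message was passed instead. A simplified, stand-alone sketch of that check (my own illustration, not the actual Cython code in `arguments.pyx.pxi`):

```python
def coerce_channel_arg(value):
    """Simplified stand-in for the check grpc performs in
    grpc._cython.cygrpc._ChannelArgs: channel-option values must be
    int, bytes, or str (str is encoded to bytes); anything else --
    such as a grpc_gcp_pb2.ApiConfig message -- raises TypeError."""
    if isinstance(value, (int, bytes)):
        return value
    if isinstance(value, str):
        return value.encode("ascii")
    raise TypeError(
        "Expected int, bytes, or behavior, got %s" % type(value))


# Valid values pass through; an arbitrary object fails the same way
# the traceback above does.
assert coerce_channel_arg(30000) == 30000
assert coerce_channel_arg("on") == b"on"
try:
    coerce_channel_arg(object())
except TypeError:
    pass
```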
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:754 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpeb6rwtdm/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617
INFO     root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f4ad01bd200> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f4ad01bd9e0> ====================
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:573 Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/pbr-5.9.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/pbr-5.9.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/mock-2.0.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/mock-2.0.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/six-1.16.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/six-1.16.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/PyHamcrest-1.10.1.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/parameterized-0.7.5.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/parameterized-0.7.5.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/seaborn-0.11.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/matplotlib-3.5.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/matplotlib-3.5.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/dataflow-worker.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/dataflow-worker.jar in 5 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727093952-768680-m23eerbm.1658914792.768886/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20220727093952769887-3550'
createTime: '2022-07-27T09:40:03.721301Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-07-27_02_40_01-11545245067662907333'
location: 'us-central1'
name: 'beamapp-jenkins-0727093952-768680-m23eerbm'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-07-27T09:40:03.721301Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-07-27_02_40_01-11545245067662907333]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-07-27_02_40_01-11545245067662907333
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_02_40_01-11545245067662907333?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_02_40_01-11545245067662907333?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_02_40_01-11545245067662907333 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:04.150Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-07-27_02_40_01-11545245067662907333. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:04.263Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-27_02_40_01-11545245067662907333.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:05.957Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.658Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.693Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.727Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.767Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.807Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.831Z: JOB_MESSAGE_DETAILED: Unzipping flatten s5 for input s3.unbatchable
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.861Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToSpanner/Writing to spanner, through flatten WriteToSpanner/make batches/Merging batchable and unbatchable, into producer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.883Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/Writing to spanner into WriteToSpanner/make batches/ParDo(_BatchFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.913Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Making mutation groups into Create/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.952Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn) into WriteToSpanner/make batches/Making mutation groups
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:07.984Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/ParDo(_BatchFn) into WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.026Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.062Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.099Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.136Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.255Z: JOB_MESSAGE_DEBUG: Executing wait step start6
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.324Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.369Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:08.393Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:44.413Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:40:47.207Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:41:15.255Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:46:37.903Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:46:37.996Z: JOB_MESSAGE_DEBUG: Executing success step success4
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:46:38.072Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:46:38.140Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:46:38.184Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:47:11.873Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:47:11.927Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T09:47:11.953Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-27_02_40_01-11545245067662907333 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
==== 2 failed, 1 passed, 4 skipped, 2 warnings, 10 errors in 924.00 seconds ====
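The two FutureWarnings in the summary are expected for the experimental Spanner connector, not part of the failure. If they are noisy in local runs, a standard `warnings` filter silences them (a generic Python sketch, unrelated to Beam's own tooling):

```python
import warnings

# Suppress the experimental-API FutureWarning shown in the summary
# above, as with any Python warning; the filter is checked before the
# "always" default, so the matching warning is dropped.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # record everything by default
    warnings.filterwarnings(
        "ignore", message=".*experimental.*", category=FutureWarning)
    warnings.warn(
        "WriteToSpanner is experimental. No backwards-compatibility "
        "guarantees.",
        FutureWarning)

assert caught == []  # the warning was filtered out
```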
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 73
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
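Both Gradle failures bottom out the same way: the `sh` process that ran pytest exited non-zero because of the test failures and errors above. How a child's exit status becomes Gradle's "finished with non-zero exit value" message can be seen with a stdlib sketch (generic Python, not Beam's build tooling):

```python
import subprocess

# Run a shell command the way an exec task does and inspect its exit
# status; a non-zero returncode is exactly what Gradle reports as
# "Process 'command 'sh'' finished with non-zero exit value 1".
result = subprocess.run(["sh", "-c", "exit 1"])
assert result.returncode == 1
```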
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 165
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 26m 55s
227 actionable tasks: 164 executed, 57 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/zdsc2o37pglc4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PostCommit_Python37 #5520
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5520/display/redirect?page=changes>
Changes:
[chamikaramj] Adds KV support for the Java RunInference transform.
[noreply] Enable configuration to avoid successfully written Table Row propagation
[noreply] lint fixes for recent import (#22455)
[noreply] Bump Python Combine LoadTests timeout to 12 hours (#22439)
[noreply] convert windmill min timestamp to beam min timestamp (#21915)
[noreply] [CdapIO] Fixed necessary warnings (#22399)
[noreply] [#22051]: Add read_time support to Google Cloud Datastore connector
------------------------------------------
[...truncated 58.37 MB...]
>   with database.batch() as batch:
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:153:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:611: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
________________ SpannerWriteIntegrationTest.test_write_batches ________________
[gw1] linux -- Python 3.7.12 <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7>
self = <apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest testMethod=test_write_batches>
    @pytest.mark.spannerio_it
    def test_write_batches(self):
      _prefex = 'test_write_batches'
      mutations = [
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '1', _prefex + 'inset-1')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '2', _prefex + 'inset-2')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '3', _prefex + 'inset-3')]),
          WriteMutation.insert(
              'Users', ('UserId', 'Key'), [(_prefex + '4', _prefex + 'inset-4')])
      ]

      p = beam.Pipeline(argv=self.args)
      _ = (
          p | beam.Create(mutations) | WriteToSpanner(
              project_id=self.project,
              instance_id=self.instance,
              database_id=self.TEST_DATABASE,
              max_batch_size_bytes=250))

      res = p.run()
      res.wait_until_finish()
>     self.assertEqual(self._count_data(_prefex), len(mutations))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:88: in _count_data
    with database.snapshot() as snapshot:
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:649: in __enter__
    session = self._session = self._database._pool.get()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py:273: in get
    session.create()
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py:113: in create
    api = self._database.spanner_api
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:235: in spanner_api
    client_options=client_options,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py:194: in __init__
    address=api_endpoint, channel=channel, credentials=credentials
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:77: in __init__
    "grpc.keepalive_time_ms": _GRPC_KEEPALIVE_MS,
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/transports/spanner_grpc_transport.py:114: in create_channel
    address, credentials=credentials, scopes=cls._OAUTH_SCOPES, **kwargs
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:297: in create_channel
    return grpc.secure_channel(target, composite_credentials, **kwargs)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/__init__.py:2005: in secure_channel
    credentials._credentials, compression)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/grpc/_channel.py:1480: in __init__
    credentials)
src/python/grpcio/grpc/_cython/_cygrpc/channel.pyx.pxi:454: in grpc._cython.cygrpc.Channel.__cinit__
    ???
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:76: in grpc._cython.cygrpc._ChannelArgs.__cinit__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>   ???
E   TypeError: Expected int, bytes, or behavior, got <class 'grpc_gcp_pb2.ApiConfig'>
src/python/grpcio/grpc/_cython/_cygrpc/arguments.pyx.pxi:60: TypeError
------------------------------ Captured log call -------------------------------
[32mINFO [0m apache_beam.runners.portability.stager:stager.py:754 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python3.7',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpln2a1h3h/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
[32mINFO [0m apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
[32mINFO [0m root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
[32mINFO [0m root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617
[32mINFO [0m root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20220617" for Docker environment
[32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f6c559f6200> ====================
[32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f6c559f69e0> ====================
[33mWARNING [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:573 Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/requirements.txt...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/requirements.txt in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/mock-2.0.0-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/seaborn-0.11.2-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/PyHamcrest-1.10.1-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/parameterized-0.7.5-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/matplotlib-3.5.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/matplotlib-3.5.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/dataflow-worker.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/dataflow-worker.jar in 6 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0727034347-559967-4kdurm23.1658893427.560156/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20220727034347561093-3036'
createTime: '2022-07-27T03:43:56.566974Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-07-26_20_43_56-16902464161539956040'
location: 'us-central1'
name: 'beamapp-jenkins-0727034347-559967-4kdurm23'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-07-27T03:43:56.566974Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-07-26_20_43_56-16902464161539956040]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-07-26_20_43_56-16902464161539956040
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-26_20_43_56-16902464161539956040?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-26_20_43_56-16902464161539956040?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-26_20_43_56-16902464161539956040 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:43:57.052Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-07-26_20_43_56-16902464161539956040. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:43:57.268Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-07-26_20_43_56-16902464161539956040.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:43:59.064Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.295Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.327Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.360Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.395Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.435Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.471Z: JOB_MESSAGE_DETAILED: Unzipping flatten s5 for input s3.unbatchable
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.494Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToSpanner/Writing to spanner, through flatten WriteToSpanner/make batches/Merging batchable and unbatchable, into producer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.540Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/Writing to spanner into WriteToSpanner/make batches/ParDo(_BatchFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.560Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Making mutation groups into Create/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.587Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn) into WriteToSpanner/make batches/Making mutation groups
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.611Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToSpanner/make batches/ParDo(_BatchFn) into WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.648Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.710Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.739Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.769Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.903Z: JOB_MESSAGE_DEBUG: Executing wait step start6
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:00.977Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:01.022Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:01.051Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:18.810Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:44:40.023Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:45:06.435Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:09.750Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+WriteToSpanner/make batches/Making mutation groups+WriteToSpanner/make batches/Filtering Batchable Mutations/ParDo(_BatchableFilterFn)+WriteToSpanner/Writing to spanner+WriteToSpanner/make batches/ParDo(_BatchFn)+WriteToSpanner/Writing to spanner
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:09.824Z: JOB_MESSAGE_DEBUG: Executing success step success4
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:09.900Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:09.945Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:09.997Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:51.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:51.526Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-07-27T03:50:51.560Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-07-26_20_43_56-16902464161539956040 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
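The two FutureWarnings above come from the experimental `WriteToSpanner` transform. As a side note, such warnings can be captured and asserted on with the standard library alone; the sketch below uses a hypothetical stand-in function (not Beam code) purely to illustrate the mechanism:

```python
import warnings

# Hypothetical stand-in for an experimental API that emits a FutureWarning,
# mirroring the message seen in the summary above.
def call_experimental():
    warnings.warn(
        "WriteToSpanner is experimental. No backwards-compatibility guarantees.",
        FutureWarning)
    return "ok"

# Record warnings instead of letting them print, then inspect them.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # ensure the warning is not suppressed
    result = call_experimental()

assert result == "ok"
assert any(issubclass(w.category, FutureWarning) for w in caught)
```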
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
==== 2 failed, 1 passed, 4 skipped, 2 warnings, 10 error in 923.85 seconds =====
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 73
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 165
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:spannerioIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 30m 40s
227 actionable tasks: 192 executed, 29 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/oiwovkr7bf3y6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org