Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/12/17 20:23:51 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #4648

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4648/display/redirect>

Changes:


------------------------------------------
[...truncated 50.94 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
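
(Illustrative sketch, not from this build log: the non-deprecated pattern is to keep the PipelineOptions object used to construct the pipeline and read settings from it directly, instead of reaching back through <pipeline>.options. Names and values below are placeholders.)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Build the options once and reuse the same object everywhere.
    options = PipelineOptions(temp_location="gs://my-bucket/tmp")  # placeholder bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])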

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
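
(Illustrative sketch, not from this build log: the FutureWarning above can be addressed by selecting only numeric columns before the reduction. The frame below is a placeholder standing in for airline_df.)

    import pandas as pd

    df = pd.DataFrame({"airline": ["AA", "UA"], "departure_delay": [5.0, -2.0]})
    # Restrict the reduction to numeric columns so pandas does not silently
    # drop non-numeric ("nuisance") columns.
    means = df.select_dtypes(include="number").mean()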

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
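
(Illustrative sketch, not from this build log: the replacement the warning suggests, with placeholder identifiers.)

    from google.cloud import bigquery

    client = bigquery.Client()
    # Either address the table directly by string ...
    table = client.get_table("my_project.my_dataset.my_table")
    # ... or build an explicit DatasetReference instead of client.dataset().
    dataset_ref = bigquery.DatasetReference("my_project", "my_dataset")
    table_ref = dataset_ref.table("my_table")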

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
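
(Illustrative sketch, not from this build log: the suggested migration from BigQuerySource to ReadFromBigQuery. Query and table names are placeholders; a GCS temp_location on the pipeline options is typically required as well.)

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = (
            p
            | beam.io.ReadFromBigQuery(
                query="SELECT * FROM `my_project.my_dataset.my_table`",
                use_standard_sql=True))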

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
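
(Illustrative sketch, not from this build log: the corresponding write-side migration from BigQuerySink to WriteToBigQuery. Table, schema and data are placeholders.)

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{"name": "a", "value": 1}])
            | beam.io.WriteToBigQuery(
                table="my_project:my_dataset.my_table",
                schema="name:STRING,value:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))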

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============ 60 passed, 9 skipped, 169 warnings in 6514.71 seconds =============

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
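
(Illustrative sketch, not from this build log: minimal usage of the experimental Spanner connector these warnings refer to, with placeholder project, instance and database IDs.)

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import ReadFromSpanner

    with beam.Pipeline() as p:
        rows = (
            p
            | ReadFromSpanner(
                project_id="my-project",
                instance_id="my-instance",
                database_id="my-database",
                sql="SELECT * FROM Users"))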

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1737.71 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 23m 18s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/htqakpaxewowe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #4651

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4651/display/redirect>



Build failed in Jenkins: beam_PostCommit_Python37 #4650

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4650/display/redirect>

Changes:


------------------------------------------
[...truncated 25.70 MB...]
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:32 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:33 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:33 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1218061631-8c990f17_a35adf69-c4ff-425f-9e06-5a333629af1d on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40373.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:39803.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:34537
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1218061631-8c990f17_a35adf69-c4ff-425f-9e06-5a333629af1d: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/18 06:16:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1218061631-8c990f17_a35adf69-c4ff-425f-9e06-5a333629af1d finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639808195.732435840","description":"Error received from peer ipv4:127.0.0.1:34537","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639808195.732435840","description":"Error received from peer ipv4:127.0.0.1:34537","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639808195.732461666","description":"Error received from peer ipv4:127.0.0.1:40373","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639808195.732449864","description":"Error received from peer ipv4:127.0.0.1:39803","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639808235.601362542","description":"Error received from peer ipv4:127.0.0.1:38165","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1656.55 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 32m 6s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/yygioalqsrtj6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python37 #4649

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4649/display/redirect>

Changes:


------------------------------------------
[...truncated 29.75 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac0TzRPNEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:3, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8bf5179758fb6719d125b5dbf4907919, jobId: 5f9d67963fb14f5578d3504b80ebf5c0).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:6, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 04f4b8307039123734610c1c2aed6740, jobId: 5f9d67963fb14f5578d3504b80ebf5c0).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:1, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 7bafc5c24888e9069cbe0ded5a74b4ba, jobId: 5f9d67963fb14f5578d3504b80ebf5c0).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:13, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: dac07e413ae81d350ebc4fbf2d118030, jobId: 5f9d67963fb14f5578d3504b80ebf5c0).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job 5f9d67963fb14f5578d3504b80ebf5c0 with leader id b50f3b1d5a9d8dca97fb7fdab19c423e lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-9a9daa34-3645-44a8-b0b0-f2dbd92832c4'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-8ffc9377-f8d7-40b2-84f5-d98f9fab23c8'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-454b9dba-1089-48f6-b40c-5dd9479a4e70'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:45905'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 18, 2021 12:18:34 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 243.86 seconds ==========================
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639786513.287152130","description":"Error received from peer ipv4:127.0.0.1:33673","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639786676.352303465","description":"Error received from peer ipv4:127.0.0.1:45399","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
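
The two read_grpc_client_inputs tracebacks above end in StatusCode.CANCELLED with "Multiplexer hanging up" and are raised after the portable Flink suite has already reported DONE and 7 passed, so they look like data-plane channels being torn down at shutdown rather than the cause of the postCommitIT failure reported below. A minimal sketch, assuming a hypothetical drain_fn callback and not Beam's actual data_plane.py code, of a reader loop that tolerates that status:

    import grpc

    def read_inputs(elements_iterator, drain_fn):
        # Iterate the streaming gRPC response and swallow only the CANCELLED
        # status the channel raises when the peer hangs up at shutdown.
        try:
            for elements in elements_iterator:
                drain_fn(elements)
        except grpc.RpcError as e:
            if e.code() != grpc.StatusCode.CANCELLED:
                raise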


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
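
The flags printed above are ordinary Beam pipeline options and can be handed straight to the SDK's option parser; a minimal sketch, using only a subset of the flags shown and a local print instead of the Gradle/pytest harness driving this run:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    # Inspect the parsed Google Cloud settings via the corresponding view.
    gcp = options.view_as(GoogleCloudOptions)
    print(gcp.project, gcp.region, gcp.temp_location)
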
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1729.50 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 36m 13s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/tmaselclmkyss

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org