Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/12/15 08:45:33 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #4638

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4638/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13434] Bump log4j to 2.16.0. (#16237)


------------------------------------------
[...truncated 34.41 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
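
This BeamDeprecationWarning, which recurs for several transforms below, flags code that reads <pipeline>.options after the pipeline has been constructed. A minimal sketch of the pattern the warning points toward, using a hypothetical pipeline and placeholder values, is to keep the PipelineOptions object used to build the pipeline and read settings from it directly:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Hypothetical example: hold on to the options used to construct the pipeline.
    options = PipelineOptions(temp_location='gs://my-bucket/tmp')  # placeholder bucket
    with beam.Pipeline(options=options) as p:
        # Read values from the options object itself instead of p.options,
        # which is deprecated since the first stable release.
        temp_location = options.view_as(GoogleCloudOptions).temp_location
        _ = p | beam.Create([temp_location]) | beam.Map(print)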

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
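
The FutureWarning above recommends selecting only valid columns before the reduction. A minimal sketch of that fix, shown with plain pandas and a hypothetical frame standing in for airline_df:

    import pandas as pd

    # Hypothetical data: one string column plus two numeric delay columns.
    airline_df = pd.DataFrame({
        'airline': ['AA', 'UA'],
        'departure_delay': [5.0, 12.0],
        'arrival_delay': [3.0, 9.0],
    })

    # Select the numeric columns explicitly instead of letting mean()
    # silently drop the non-numeric ones (the deprecated behaviour).
    means = airline_df.select_dtypes(include='number').mean()
    print(means)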

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
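
The PendingDeprecationWarning above names its replacements: build the table reference from a "project.dataset" string or a DatasetReference instead of Client.dataset. A minimal google-cloud-bigquery sketch with placeholder identifiers:

    from google.cloud import bigquery

    project_id, dataset_id, table_id = 'my-project', 'my_dataset', 'my_table'  # placeholders

    # Instead of client.dataset(dataset_id).table(table_id):
    table_ref = bigquery.TableReference.from_string(
        '{}.{}.{}'.format(project_id, dataset_id, table_id))
    # or, equivalently, via an explicit DatasetReference:
    table_ref = bigquery.DatasetReference(project_id, dataset_id).table(table_id)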

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
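
The warning above names the replacement transform directly. A minimal migration sketch, with a placeholder query (ReadFromBigQuery still needs the usual GCP project and temp settings when run for real):

    import apache_beam as beam

    query = 'SELECT 1 AS x'  # placeholder query

    with beam.Pipeline() as p:
        # Deprecated since 2.25.0:
        #   p | beam.io.Read(beam.io.BigQuerySource(query=query, use_standard_sql=True))
        # Replacement:
        rows = p | beam.io.ReadFromBigQuery(query=query, use_standard_sql=True)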

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
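
BigQuerySink is likewise superseded by WriteToBigQuery. A minimal sketch with placeholder table and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'name': 'a', 'value': 1}])  # placeholder rows
        rows | beam.io.WriteToBigQuery(
            table='my-project:my_dataset.my_table',            # placeholder table spec
            schema='name:STRING,value:INTEGER',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)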

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 4 failed, 57 passed, 8 skipped, 173 warnings in 7921.18 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
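
These FutureWarnings come from the experimental Python-native Spanner connector exercised by the integration tests. A minimal read sketch with placeholder identifiers, assuming the experimental API keeps the shape used in the snippets above (no backwards-compatibility guarantees):

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import ReadFromSpanner

    with beam.Pipeline() as p:
        rows = p | ReadFromSpanner(
            project_id='my-project',     # placeholder
            instance_id='my-instance',   # placeholder
            database_id='my-database',   # placeholder
            sql='select * from Users')   # mirrors the test snippet above
        rows | beam.Map(print)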

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1704.81 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
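
One way to reproduce the failing suite locally with the flags Gradle suggests, assuming a Beam source checkout (command shown for illustration only):

    ./gradlew :sdks:python:test-suites:dataflow:py37:postCommitIT --stacktrace --info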

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 45m 2s
217 actionable tasks: 168 executed, 45 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cwwuudsk7ofie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #4647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4647/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python37 #4646

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4646/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13388] Update Cloud DLP after breaking changes. (#16236)

[noreply] [BEAM-13434] Bump google pubsublite on master. (#16265)


------------------------------------------
[...truncated 47.87 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 59 passed, 9 skipped, 171 warnings in 7945.93 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1684.87 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 59m 46s
217 actionable tasks: 176 executed, 37 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/d5uydegdlne2i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4645

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4645/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12931] Allow for DoFn#getAllowedTimestampSkew() when checking the

[noreply] [BEAM-13467] Properly handle null argument types for logical types.

[noreply] [BEAM-10277] Initial implementation for encoding position in Python

[noreply] [BEAM-11545] State & timer for batched RPC calls pattern (#13643)

[noreply] Automatically prune local images before building an RC. (#16238)

[noreply] Add verbose error messages to container-related scripts. (#16056)

[noreply] [BEAM-13456] Rollback #15890 to fix timeout in Java PostCommit (#16257)

[noreply] [BEAM-13015] Add a state backed iterable that can be mutated under


------------------------------------------
[...truncated 32.49 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:12, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: afb4031c31104f1f627d51ba4e2a597f, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:2, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: c31e5e237d650b840ff19fec5000a516, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:10, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 0836983fbdd4d822e5551632df0c86e3, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:9, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 41aba9cd409b5802b999bbcfaba56d75, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:3, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 4c955e85a31414fc1d75b9feb1e1192f, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:15, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: f1a7a38b1c1166317a8acf638fa5103d, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:5, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: d82e66a3782c0c15c30868498d4a3d81, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job 0e4893da9dd618e5985a7423c148a190 with leader id 97f16e6aa1f14afee97505b03a82498c lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:7, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: cc76f1bdc115eb74487ea7b2efa5c81c, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-89cca495-91e6-439f-b98a-52dc451026f3'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-30645bc4-6639-4381-85c9-af71e032719e'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-9fc8951b-b440-404f-b702-1a9b9ca0a30c'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:40239'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 243.01 seconds ==========================
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639700489.050308639","description":"Error received from peer ipv4:127.0.0.1:40215","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639700551.147684363","description":"Error received from peer ipv4:127.0.0.1:37935","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1693.14 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 41m 55s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/zbeiyvoxt3geg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4644

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4644/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16061 from [BEAM-13428] [Playground] Integrate

[noreply] Clarify CoGroupByKey creates Iterable, not list. (#16099)


------------------------------------------
[...truncated 47.32 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
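
A hedged aside on the BeamDeprecationWarning above: the message says that reading options back off the constructed pipeline via <pipeline>.options will eventually stop working. A minimal sketch of the keep-your-own-options pattern follows; the project and bucket names are placeholders, and the pipeline body is illustrative only, not code from this build.

  import apache_beam as beam
  from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

  # Build the options once and keep a reference to them.
  options = PipelineOptions(project="my-project", temp_location="gs://my-bucket/temp")

  with beam.Pipeline(options=options) as p:
      _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)

  # Deprecated style flagged in these warnings: p.options.view_as(GoogleCloudOptions)
  # Preferred: read from the options object you already hold.
  temp_location = options.view_as(GoogleCloudOptions).temp_location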

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
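
As a hedged aside to the PendingDeprecationWarning above, a minimal sketch of the suggested replacement for Client.dataset(); the project, dataset, and table names below are placeholders, not values from this build.

  from google.cloud import bigquery

  client = bigquery.Client(project="my_project")

  # Deprecated helper flagged above:
  #   table_ref = client.dataset("my_dataset").table("my_table")

  # Suggested alternatives: an explicit DatasetReference, or a fully qualified string.
  dataset_ref = bigquery.DatasetReference("my_project", "my_dataset")
  table_ref = dataset_ref.table("my_table")
  table = client.get_table("my_project.my_dataset.my_table")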

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 57 passed, 9 skipped, 175 warnings in 6582.51 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1725.95 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 24m 23s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/st5tnqbv27yra

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4643

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4643/display/redirect>

Changes:


------------------------------------------
[...truncated 23.40 MB...]
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.293957948684692 seconds.
INFO:root:Successfully completed job in 5.293957948684692 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:38729
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f38656a5e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f38656a5ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f38656a3680> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempxsprn5iu/artifactsi5w1dsba' '--job-port' '56079' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:36845'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:46553'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:56079'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:56079.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
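
(A minimal sketch of the context-manager usage the message above recommends, so the program blocks until the portable job finishes. The flags are assumptions for illustration; the job endpoint port is taken from the JobService line earlier in this log.)

  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions

  options = PipelineOptions([
      "--runner=PortableRunner",
      "--job_endpoint=localhost:56079",  # JobService port reported above
      "--environment_type=LOOPBACK",
  ])

  # The context manager waits for the run to complete before the program exits.
  with beam.Pipeline(options=options) as p:
      (p
       | beam.Create(["to be or not to be"])
       | beam.FlatMap(str.split)
       | beam.combiners.Count.PerElement()
       | beam.Map(print))
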
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:20 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:20 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:21 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:21 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40989.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40205.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42171
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1639656984.062141701","description":"Error received from peer ipv4:127.0.0.1:40989","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639656984.062102293","description":"Error received from peer ipv4:127.0.0.1:40205","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639656984.062091255","description":"Error received from peer ipv4:127.0.0.1:42171","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639656984.062091255","description":"Error received from peer ipv4:127.0.0.1:42171","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1762.54 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 24m 31s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/aml2qlu24kc3a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4642

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4642/display/redirect?page=changes>

Changes:

[thiagotnunes] [BEAM-12164] Add Spanner Change Stream DAOs

[noreply] [BEAM-13218] Sickbay

[noreply] [BEAM-13399] Add infrastructure to start JARs from Go functions (#16214)


------------------------------------------
[...truncated 45.28 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
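  The PendingDeprecationWarning above points at the newer reference style in google-cloud-bigquery. A minimal sketch, with placeholder project, dataset and table ids:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Build the reference from a fully qualified string rather than the
    # deprecated client.dataset(...).table(...) chain.
    table_ref = bigquery.TableReference.from_string('my-project.my_dataset.my_table')
    table = client.get_table(table_ref)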

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
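  The deprecation message above recommends ReadFromBigQuery over BigQuerySource; a minimal sketch with an illustrative query:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.io.ReadFromBigQuery(
                 query='SELECT name FROM `my-project.my_dataset.my_table`',
                 use_standard_sql=True)
             | beam.Map(print))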

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
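  Likewise, the BigQuerySink deprecation above points at WriteToBigQuery; a minimal sketch with placeholder table and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a'}, {'name': 'b'}])
             | beam.io.WriteToBigQuery(
                 table='my-project:my_dataset.my_table',
                 schema='name:STRING',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))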

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 2 failed, 58 passed, 9 skipped, 175 warnings in 6403.47 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
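  The Spanner warnings above all come from the experimental connector used by these integration tests. A minimal, hedged read sketch (project, instance and database ids are placeholders; the module path matches the test files above):

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import ReadFromSpanner

    with beam.Pipeline() as p:
        _ = (p
             | ReadFromSpanner(
                 project_id='my-project',
                 instance_id='my-instance',
                 database_id='my-database',
                 sql='SELECT UserId, Key FROM Users')
             | beam.Map(print))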

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1689.09 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 28m 47s
217 actionable tasks: 165 executed, 48 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ujpmftiam533g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4641

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4641/display/redirect?page=changes>

Changes:

[noreply] Update grafana from 8.1.6 to 8.1.8

[noreply] [BEAM-13015] Update FakeBeamFnStateClient to generate elements that stop


------------------------------------------
[...truncated 8.55 MB...]
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216003106-4a03e9b3_fc0db576-cc0d-44b0-bc95-583c2f78e205: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.19 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216003106-4a03e9b3_fc0db576-cc0d-44b0-bc95-583c2f78e205 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1639614677.123967127","description":"Error received from peer ipv4:127.0.0.1:38341","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1639614677.123967127","description":"Error received from peer ipv4:127.0.0.1:38341","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639614677.123981997","description":"Error received from peer ipv4:127.0.0.1:43847","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639614677.124000971","description":"Error received from peer ipv4:127.0.0.1:45839","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
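  The UNAVAILABLE / "Socket closed" rendezvous errors above are raised when the runner tears the data and control channels down after the job finishes. A minimal, hedged sketch of tolerating those statuses around any gRPC streaming-response iterator (the name `stream` is a stand-in, not a Beam API):

    import grpc

    def drain(stream):
        # 'stream' stands in for any gRPC streaming-response iterator.
        try:
            for element in stream:
                yield element
        except grpc.RpcError as err:
            if err.code() in (grpc.StatusCode.UNAVAILABLE, grpc.StatusCode.CANCELLED):
                # Expected when the peer closes the channel at shutdown;
                # treat it as end-of-stream instead of a hard failure.
                return
            raise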


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639614759.543300236","description":"Error received from peer ipv4:127.0.0.1:42921","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639614855.814318265","description":"Error received from peer ipv4:127.0.0.1:36539","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1703.82 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 182

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 50m 21s
217 actionable tasks: 163 executed, 50 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ut7q2ed6x4jro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_PostCommit_Python37 - Build # 4640 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4640 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4640/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python37 #4639

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4639/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13209] Fix DynamoDBIO.write to properly handle partial success


------------------------------------------
[...truncated 48.15 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 181 warnings in 6730.00 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1680.28 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 23m 38s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7feq7ihym7pa2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org