Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/12/11 14:17:57 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #4623

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4623/display/redirect>

Changes:


------------------------------------------
[...truncated 55.22 MB...]

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or
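
This BeamDeprecationWarning recurs throughout the GCP IO warnings below; in user code, one way to avoid it is to keep a handle on the PipelineOptions object used to build the pipeline and read settings from it, rather than reaching back through <pipeline>.options. A minimal sketch (the bucket name is a placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Build the options once and keep the reference around.
    options = PipelineOptions(temp_location='gs://my-bucket/temp')
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)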

apache_beam/io/gcp/bigquery.py:2118 (repeated 13 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112 (repeated 18 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114 (repeated 18 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
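
The replacement this google-cloud-bigquery warning suggests looks roughly like the sketch below; project, dataset, and table names are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client(project='my_project')

    # Either build an explicit reference object...
    table_ref = bigquery.DatasetReference('my_project', 'my_dataset').table('my_table')

    # ...or pass a fully qualified string wherever a table is expected.
    table = client.get_table('my_project.my_dataset.my_table')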

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
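
For reference, the migration this deprecation points at replaces Read(BigQuerySource(...)) with the ReadFromBigQuery transform applied directly to the pipeline; the query and gcs_location below are placeholders:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = (
            p
            | 'Read' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x',
                use_standard_sql=True,
                gcs_location='gs://my-bucket/tmp')  # temp GCS path for export-based reads
            | beam.Map(print))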

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
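
The suggested replacement for BigQuerySink is the WriteToBigQuery transform; a rough sketch with a placeholder table spec and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])
            | beam.io.WriteToBigQuery(
                'my_project:my_dataset.my_table',
                schema='name:STRING,count:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))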

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(
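
These two experimental DLP transforms are typically wired up roughly as below; the project id, the inspect/deidentify config contents, and the exact keyword names are assumptions that should be checked against apache_beam.ml.gcp.dlp:

    import apache_beam as beam
    from apache_beam.ml.gcp.dlp import InspectForDetails, MaskDetectedDetails

    # Placeholder DLP configs; real pipelines would use project-specific settings.
    INSPECT_CONFIG = {'info_types': [{'name': 'EMAIL_ADDRESS'}]}
    DEIDENTIFY_CONFIG = {
        'info_type_transformations': {
            'transformations': [{
                'primitive_transformation': {
                    'character_mask_config': {'masking_character': '#'}}
            }]
        }
    }

    with beam.Pipeline() as p:
        texts = p | beam.Create(['contact someone@example.com'])
        masked = texts | MaskDetectedDetails(
            project='my-project',
            deidentification_config=DEIDENTIFY_CONFIG,
            inspection_config=INSPECT_CONFIG)
        findings = texts | InspectForDetails(
            project='my-project', inspection_config=INSPECT_CONFIG)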

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())
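
ReadAllFromBigQuery differs from ReadFromBigQuery in that it consumes a PCollection of read requests rather than a single table or query; a hedged sketch, assuming ReadFromBigQueryRequest is importable from apache_beam.io.gcp.bigquery and using placeholder table/query values:

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import ReadAllFromBigQuery, ReadFromBigQueryRequest

    with beam.Pipeline() as p:
        rows = (
            p
            | beam.Create([
                ReadFromBigQueryRequest(query='SELECT 1 AS x'),
                ReadFromBigQueryRequest(table='my_project:my_dataset.my_table'),
            ])
            | ReadAllFromBigQuery()
            | beam.Map(print))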

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 55 passed, 11 skipped, 182 warnings in 6381.52 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
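
The experimental Spanner transforms exercised by these tests are used roughly as follows; project, instance, database, table, and the sample mutation are placeholders:

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        ReadFromSpanner, WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        users = p | 'ReadUsers' >> ReadFromSpanner(
            project_id='my-project',
            instance_id='my-instance',
            database_id='my-database',
            sql='select * from Users')

        _ = (
            p
            | beam.Create([
                WriteMutation.insert(
                    table='Users', columns=('UserId', 'Key'), values=[(1, 'a')])
            ])
            | 'WriteUsers' >> WriteToSpanner(
                project_id='my-project',
                instance_id='my-instance',
                database_id='my-database'))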

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1634.26 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 17m 24s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/mclrm5ku6w2na

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #4647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4647/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python37 #4646

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4646/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13388] Update Cloud DLP after breaking changes. (#16236)

[noreply] [BEAM-13434] Bump google pubsublite on master. (#16265)


------------------------------------------
[...truncated 47.87 MB...]
apache_beam/io/gcp/bigquery.py:2453 (repeated 10 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
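
The fix pandas is asking for here is to narrow the frame to numeric columns (or pass numeric_only explicitly) before the reduction; a small sketch with a stand-in DataFrame:

    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'UA'], 'delay': [12.0, 3.5], 'flights': [10, 7]})

    # Restrict to numeric columns before the reduction...
    means = df.select_dtypes(include='number').mean()
    # ...or, on newer pandas, be explicit: df.mean(numeric_only=True)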

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123 (repeated 13 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129 (repeated 18 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131 (repeated 18 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 1 failed, 59 passed, 9 skipped, 171 warnings in 7945.93 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1684.87 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 59m 46s
217 actionable tasks: 176 executed, 37 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/d5uydegdlne2i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4645

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4645/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12931] Allow for DoFn#getAllowedTimestampSkew() when checking the

[noreply] [BEAM-13467] Properly handle null argument types for logical types.

[noreply] [BEAM-10277] Initial implementation for encoding position in Python

[noreply] [BEAM-11545] State & timer for batched RPC calls pattern (#13643)

[noreply] Automatically prune local images before building an RC. (#16238)

[noreply] Add verbose error messages to container-related scripts. (#16056)

[noreply] [BEAM-13456] Rollback #15890 to fix timeout in Java PostCommit (#16257)

[noreply] [BEAM-13015] Add a state backed iterable that can be mutated under


------------------------------------------
[...truncated 32.49 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:12, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: afb4031c31104f1f627d51ba4e2a597f, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:2, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: c31e5e237d650b840ff19fec5000a516, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:10, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 0836983fbdd4d822e5551632df0c86e3, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:9, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 41aba9cd409b5802b999bbcfaba56d75, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:3, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 4c955e85a31414fc1d75b9feb1e1192f, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:15, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: f1a7a38b1c1166317a8acf638fa5103d, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:5, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: d82e66a3782c0c15c30868498d4a3d81, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job 0e4893da9dd618e5985a7423c148a190 with leader id 97f16e6aa1f14afee97505b03a82498c lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:7, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: cc76f1bdc115eb74487ea7b2efa5c81c, jobId: 0e4893da9dd618e5985a7423c148a190).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-89cca495-91e6-439f-b98a-52dc451026f3'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-30645bc4-6639-4381-85c9-af71e032719e'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-9fc8951b-b440-404f-b702-1a9b9ca0a30c'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:40239'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 17, 2021 12:24:44 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 243.01 seconds ==========================
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639700489.050308639","description":"Error received from peer ipv4:127.0.0.1:40215","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639700551.147684363","description":"Error received from peer ipv4:127.0.0.1:37935","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
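
The write side exercised by test_write_batches can be sketched the same way; the IDs, table and column names below are placeholders, and max_batch_size_bytes only mirrors the tiny batch size shown in the warning above:

    # Hedged sketch: write mutations to Cloud Spanner with the experimental
    # connector; all IDs and the table layout are placeholders.
    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([
                WriteMutation.insert(
                    table='Users',
                    columns=('UserId', 'Key'),
                    values=[(1, 'key1'), (2, 'key2')])
            ])
            | WriteToSpanner(
                project_id='my-project',      # placeholder
                instance_id='my-instance',    # placeholder
                database_id='my-database',    # placeholder
                max_batch_size_bytes=250))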

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1693.14 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 41m 55s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/zbeiyvoxt3geg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4644

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4644/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16061 from [BEAM-13428] [Playground] Integrate

[noreply] Clarify CoGroupByKey creates Iterable, not list. (#16099)


------------------------------------------
[...truncated 47.32 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
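
The deprecation message points away from reading <pipeline>.options back out of the pipeline object; a minimal sketch of the recommended direction, assuming the options are built explicitly up front (project and temp_location values are placeholders):

    # Hedged sketch: construct PipelineOptions once and pass them in,
    # instead of dereferencing <pipeline>.options later.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--project=my-project',                # placeholder
        '--temp_location=gs://my-bucket/tmp',  # placeholder
    ])
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(print)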

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
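
This pandas FutureWarning goes away if the reduction is limited to numeric columns explicitly; a small stand-alone sketch (the airline_df and at_top_airports names only mirror the example above, and the data here is made up):

    # Hedged sketch: restrict the mean() to numeric columns so pandas does
    # not have to drop non-numeric "nuisance" columns implicitly.
    import pandas as pd

    airline_df = pd.DataFrame({
        'airline': ['AA', 'DL'],          # non-numeric column
        'departure_delay': [5.0, 7.5],
        'arrival_delay': [3.0, 9.0],
    })
    at_top_airports = [True, True]        # placeholder row mask

    means = (
        airline_df[at_top_airports]
        .select_dtypes(include='number')  # or: .mean(numeric_only=True)
        .mean())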

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
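
A minimal sketch of the replacement the warning suggests, assuming google-cloud-bigquery is available (project, dataset and table names are placeholders):

    # Hedged sketch: build the table reference without Client.dataset().
    from google.cloud import bigquery

    table_ref = bigquery.DatasetReference('my-project', 'my_dataset').table('my_table')
    # or, equivalently, from a fully qualified string:
    table_ref = bigquery.TableReference.from_string('my-project.my_dataset.my_table')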

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
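
A minimal sketch of the suggested ReadFromBigQuery replacement (the query and gcs_location are placeholders, and actually running it needs GCP credentials):

    # Hedged sketch: read query results with ReadFromBigQuery instead of the
    # deprecated BigQuerySource.
    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.io.ReadFromBigQuery(
            query='SELECT 1 AS x',              # placeholder query
            use_standard_sql=True,
            gcs_location='gs://my-bucket/tmp')  # placeholder temp location
        rows | beam.Map(print)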

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
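
Likewise for the sink side, a minimal WriteToBigQuery sketch (table, schema and dispositions are placeholders; actually running it needs GCP credentials):

    # Hedged sketch: write rows with WriteToBigQuery instead of the
    # deprecated BigQuerySink.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])
            | beam.io.WriteToBigQuery(
                table='my-project:my_dataset.my_table',  # placeholder
                schema='name:STRING,count:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))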

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 57 passed, 9 skipped, 175 warnings in 6582.51 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1725.95 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 24m 23s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/st5tnqbv27yra

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4643

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4643/display/redirect>

Changes:


------------------------------------------
[...truncated 23.40 MB...]
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.293957948684692 seconds.
INFO:root:Successfully completed job in 5.293957948684692 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:38729
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f38656a5e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f38656a5ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f38656a3680> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempxsprn5iu/artifactsi5w1dsba' '--job-port' '56079' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:36845'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:46553'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:56079'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:56079.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_4ae7d275-9ca8-4d0d-bb92-26cbffcf78ba.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:19 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
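
A minimal sketch of the pattern this message asks for, assuming a portable job server is already listening on the (placeholder) endpoint below:

    # Hedged sketch: keep the pipeline inside a context manager so the
    # program blocks until the job reaches a terminal state.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',   # placeholder job server endpoint
        '--environment_type=LOOPBACK',
    ])
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['hello', 'world']) | beam.Map(print)
    # Leaving the 'with' block waits for the pipeline to finish.
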
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:20 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:20 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:21 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:21 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:40989.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40205.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:42171
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/16 12:16:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216121619-10470209_a6dc74ea-91bd-4e96-864f-eeebd5a68ab5 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1639656984.062141701","description":"Error received from peer ipv4:127.0.0.1:40989","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639656984.062102293","description":"Error received from peer ipv4:127.0.0.1:40205","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639656984.062091255","description":"Error received from peer ipv4:127.0.0.1:42171","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639656984.062091255","description":"Error received from peer ipv4:127.0.0.1:42171","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1762.54 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 24m 31s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/aml2qlu24kc3a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4642

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4642/display/redirect?page=changes>

Changes:

[thiagotnunes] [BEAM-12164] Add Spanner Change Stream DAOs

[noreply] [BEAM-13218] Sickbay

[noreply] [BEAM-13399] Add infrastructure to start JARs from Go functions (#16214)


------------------------------------------
[...truncated 45.28 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
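
The PendingDeprecationWarning above already names its replacement; spelled out with placeholder identifiers it looks roughly like this:

    from google.cloud import bigquery

    client = bigquery.Client()
    # String form:
    table = client.get_table('my_project.my_dataset.my_table')
    # Or explicit reference objects:
    dataset_ref = bigquery.DatasetReference('my_project', 'my_dataset')
    table_ref = bigquery.TableReference(dataset_ref, 'my_table')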

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
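
Several warnings in this summary point from the deprecated BigQuerySource to ReadFromBigQuery. A minimal sketch of the newer transform follows; the query is a placeholder, and a real run additionally needs GCP pipeline options such as a project and temp_location:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = (
            p
            | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS number', use_standard_sql=True))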

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
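
Likewise, the replacement named for BigQuerySink is WriteToBigQuery; a short sketch with made-up table spec, schema and rows:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])
            | beam.io.WriteToBigQuery(
                'my_project:my_dataset.my_table',
                schema='name:STRING,count:INTEGER',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))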

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())
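
ReadAllFromBigQuery, flagged experimental above, reads from a PCollection of read requests rather than from a single fixed source. A sketch, assuming ReadFromBigQueryRequest is importable from apache_beam.io.gcp.bigquery as in the test that triggers the warning (the query is a placeholder):

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import (
        ReadAllFromBigQuery, ReadFromBigQueryRequest)

    with beam.Pipeline() as p:
        rows = (
            p
            | beam.Create([ReadFromBigQueryRequest(query='SELECT 1 AS number')])
            | ReadAllFromBigQuery())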

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 2 failed, 58 passed, 9 skipped, 175 warnings in 6403.47 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
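
The transforms exercised above come from the experimental spannerio module. A compact sketch of read and write usage with placeholder project/instance/database identifiers (the SQL, table and column names echo the tests; the exact keyword surface may differ between SDK versions):

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        ReadFromSpanner, WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        rows = (
            p
            | ReadFromSpanner(
                project_id='my-project', instance_id='my-instance',
                database_id='my-db', sql='select * from Users'))

        _ = (
            p
            | beam.Create([WriteMutation.insert(
                'Users', ('UserId', 'Key'), [('u1', 'k1')])])
            | WriteToSpanner(
                project_id='my-project', instance_id='my-instance',
                database_id='my-db'))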

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1689.09 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 28m 47s
217 actionable tasks: 165 executed, 48 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ujpmftiam533g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4641

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4641/display/redirect?page=changes>

Changes:

[noreply] Update grafana from 8.1.6 to 8.1.8

[noreply] [BEAM-13015] Update FakeBeamFnStateClient to generate elements that stop


------------------------------------------
[...truncated 8.55 MB...]
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216003106-4a03e9b3_fc0db576-cc0d-44b0-bc95-583c2f78e205: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.19 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/16 00:31:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1216003106-4a03e9b3_fc0db576-cc0d-44b0-bc95-583c2f78e205 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1639614677.123967127","description":"Error received from peer ipv4:127.0.0.1:38341","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1639614677.123967127","description":"Error received from peer ipv4:127.0.0.1:38341","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639614677.123981997","description":"Error received from peer ipv4:127.0.0.1:43847","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639614677.124000971","description":"Error received from peer ipv4:127.0.0.1:45839","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
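
The tracebacks above all end in grpc's _MultiThreadedRendezvous with status UNAVAILABLE ("Connection reset by peer" / "Socket closed"), raised in background reader threads after the job had already reached DONE, i.e. typically shutdown noise rather than the test failure itself. A generic illustration (not part of the Beam worker code) of how such an error is usually recognised:

    import grpc

    def is_connection_loss(err):
        # UNAVAILABLE / CANCELLED at teardown usually means the other side of
        # the channel shut down first.
        return isinstance(err, grpc.RpcError) and err.code() in (
            grpc.StatusCode.UNAVAILABLE, grpc.StatusCode.CANCELLED)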


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639614759.543300236","description":"Error received from peer ipv4:127.0.0.1:42921","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639614855.814318265","description":"Error received from peer ipv4:127.0.0.1:36539","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1703.82 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 182

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 50m 21s
217 actionable tasks: 163 executed, 50 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ut7q2ed6x4jro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python37 - Build # 4640 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4640 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4640/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python37 #4639

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4639/display/redirect?page=changes>

Changes:

[mmack] [BEAM-13209] Fix DynamoDBIO.write to properly handle partial success


------------------------------------------
[...truncated 48.15 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 181 warnings in 6730.00 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1680.28 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 23m 38s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7feq7ihym7pa2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4638

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4638/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13434] Bump log4j to 2.16.0. (#16237)


------------------------------------------
[...truncated 34.41 MB...]
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
apache_beam/io/gcp/bigquery.py:2453
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2453: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
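
  The BeamDeprecationWarning above is raised inside Beam's own BigQuery transforms when they read options back off the pipeline object. At the user level, the non-deprecated pattern is to build the options up front and hand them to the pipeline. A minimal, hedged sketch (project and bucket names are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions()
    gcp_options = options.view_as(GoogleCloudOptions)
    gcp_options.project = 'my-project'                 # placeholder
    gcp_options.temp_location = 'gs://my-bucket/tmp'   # placeholder

    # The options are passed in explicitly rather than read back later via
    # <pipeline>.options, which is what the deprecation warning refers to.
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['hello']) | beam.Map(print)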

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
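
  The FutureWarning from the flight_delays example is the standard pandas notice about implicit dropping of non-numeric ("nuisance") columns in reductions. A hedged, plain-pandas sketch of the fix it suggests (the DataFrame below is made up; airline_df and at_top_airports only mirror the names in the log):

    import pandas as pd

    airline_df = pd.DataFrame({
        'airline': ['AA', 'UA', 'AA'],          # non-numeric column
        'arrival_delay': [12.0, 7.5, 3.0],
        'departure_delay': [5.0, 9.0, 1.0],
    })
    at_top_airports = pd.Series([True, False, True])

    # Instead of airline_df[at_top_airports].mean(), which silently drops the
    # non-numeric column, be explicit about which columns take part:
    means = airline_df[at_top_airports].mean(numeric_only=True)
    print(means)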

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1129
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1129: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1131
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1131: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
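
  The PendingDeprecationWarning above spells out its own fix: replace Client.dataset() with either a fully qualified string or a DatasetReference. A hedged sketch against the google-cloud-bigquery client (project, dataset, and table names are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')            # placeholder
    # Option 1: build the reference explicitly.
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')
    # Option 2: pass a fully qualified string straight to the client.
    table = client.get_table('my-project.my_dataset.my_table')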

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
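
  As the warning says, BigQuerySource is superseded by ReadFromBigQuery. A minimal, hedged sketch of the replacement (query and bucket are placeholders; query-based reads generally need a GCS location for temporary export files):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | 'Read' >> beam.io.ReadFromBigQuery(
                 query='SELECT 1 AS x',                 # placeholder query
                 use_standard_sql=True,
                 gcs_location='gs://my-bucket/tmp')     # placeholder bucket
             | beam.Map(print))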

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
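
  Likewise, the BigQuerySink path is superseded by WriteToBigQuery. A hedged sketch (table spec and schema are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'value': 1}])
             | beam.io.WriteToBigQuery(
                 table='my-project:my_dataset.my_table',    # placeholder
                 schema='name:STRING,value:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))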

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2594
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2594: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2595
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2595: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2608
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2608: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 4 failed, 57 passed, 8 skipped, 173 warnings in 7921.18 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1704.81 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 45m 2s
217 actionable tasks: 168 executed, 45 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cwwuudsk7ofie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4637

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4637/display/redirect?page=changes>

Changes:

[noreply] Removes the comment that seems no longer relevant.

[Valentyn Tymofieiev] Update a few dependencies that may depend on log4j transitively.

[noreply] [BEAM-13355] add Big Query parameter to enable users to specify load_…

[noreply] Updated the base images to use debian:bullseye (#16221)


------------------------------------------
[...truncated 30.68 MB...]
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215001803-d82c6ea0_10378d7f-6083-4680-9289-888af8584252: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/15 00:18:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1215001803-d82c6ea0_10378d7f-6083-4680-9289-888af8584252 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639527493.169347871","description":"Error received from peer ipv4:127.0.0.1:44307","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639527493.169347871","description":"Error received from peer ipv4:127.0.0.1:44307","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639527493.169860460","description":"Error received from peer ipv4:127.0.0.1:40133","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639527493.170029536","description":"Error received from peer ipv4:127.0.0.1:35365","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639527721.644773162","description":"Error received from peer ipv4:127.0.0.1:42389","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639527751.570726778","description":"Error received from peer ipv4:127.0.0.1:45963","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1752.30 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 46m 11s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/gn6di326axcli

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4636

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4636/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13438][Playground]

[stranniknm] [BEAM-13446]: fix incorrect tab symbol

[noreply] [BEAM-13159] Add Redis Stream (XADD) Write Support (#15858)

[noreply] Merge pull request #15994: [BEAM-13263] Support OnWindowExpiration in

[noreply] Bump dataflow container version to beam-master-20211213 (#16213)

[noreply] Merge pull request #16198 from [BEAM-13437][Playground] Add

[noreply] Merge pull request #16195 from a[BEAM-13436][Playground] Add


------------------------------------------
[...truncated 45.62 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [4df19ffc8664b9cf4ae122e9bc3c1272].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [a9300f297356667f90502a038cfe88db].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [7a797738ed8116dd5aab3b27bc25edb3].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [3f09d82fffa8df2414bc238ed1568da4].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [823f80711689ba1707dfffa9b90f9122].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [460b6b89658a57c35a016807a2f5ffd4].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [1c3b44d1a3ceb0fac919dfc1dcc36468].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [49dc243989f8d1bcf2f9bfa626c233f2].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [b69f4aa6cc5793ae4311a68bd15506ff].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [8b9c2df131b69f39dcd5360b9148f145].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [2ea66d4585342f297ecc93a9dab7b7bf].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [dc7672ab83159c610b6a4a2ceed943a2].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [99afa60fe2009ce89fda92875e7543d8].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [d6b1d3a9821dd64854a6f8c8b9cd9d75].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [ca6d94231158e7091a0aecc2d589bea2].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool releaseSlots'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Releasing slot [bf553e0d3ab42264bac008935a44adc0].'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager processResourceRequirements'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Clearing resource requirements of job fb1ed22b4432f93828c3d3c5bedd859d'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.jobmaster.JobMaster dissolveResourceManagerConnection'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Close ResourceManager connection c011cbb309886822a237d8df43353b7a: Stopping JobMaster for job BeamApp-jenkins-1214183617-2f7b9ede(fb1ed22b4432f93828c3d3c5bedd859d)..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:22 PM org.apache.flink.runtime.resourcemanager.ResourceManager closeJobManagerConnection'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Disconnect job manager 8452c52ec8af076817c9973d4eed42cc@akka://flink/user/rpc/jobmanager_3 for job fb1ed22b4432f93828c3d3c5bedd859d from the resource manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:1, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 7a797738ed8116dd5aab3b27bc25edb3, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:12, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 3f09d82fffa8df2414bc238ed1568da4, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job fb1ed22b4432f93828c3d3c5bedd859d with leader id 8452c52ec8af076817c9973d4eed42cc lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:3, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 823f80711689ba1707dfffa9b90f9122, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:14, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 460b6b89658a57c35a016807a2f5ffd4, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:2, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 1c3b44d1a3ceb0fac919dfc1dcc36468, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:7, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 49dc243989f8d1bcf2f9bfa626c233f2, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:13, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: b69f4aa6cc5793ae4311a68bd15506ff, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:4, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8b9c2df131b69f39dcd5360b9148f145, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:8, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 2ea66d4585342f297ecc93a9dab7b7bf, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:15, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: dc7672ab83159c610b6a4a2ceed943a2, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:11, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 99afa60fe2009ce89fda92875e7543d8, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:9, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: d6b1d3a9821dd64854a6f8c8b9cd9d75, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:6, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: ca6d94231158e7091a0aecc2d589bea2, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:5, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: bf553e0d3ab42264bac008935a44adc0, jobId: fb1ed22b4432f93828c3d3c5bedd859d).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-fa10cf1b-7636-46cf-ab93-307289d38d50'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-9599cee9-2743-4838-a9a3-ea077f315989'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-e1365b64-5bbe-4c46-a06f-f0786835c57e'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
PASSED                                                                   [100%]
------------------------------ live log teardown -------------------------------
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:42463'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 14, 2021 6:37:23 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'


- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 515.20 seconds ==========================

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
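
[Editor's note] The `-m=spannerio_it` filter above selects only tests carrying that pytest marker. A hedged sketch of the marker mechanics follows; the test name and body are hypothetical placeholders, not the actual Beam integration tests.

# Illustrative only: how a test becomes selectable via `-m spannerio_it`.
# The test name and body are placeholders, not Beam's real Spanner ITs.
import pytest

@pytest.mark.spannerio_it
def test_spanner_roundtrip_placeholder():
    # Real Beam ITs run a TestPipeline against a Cloud Spanner instance;
    # this stub only demonstrates the marker-based selection.
    assert True
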
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw2] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
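
[Editor's note] The FutureWarnings above come from the experimental Spanner connectors exercised by these tests. Below is a minimal sketch of such a read, assuming the apache-beam[gcp] extras are installed; the project, instance and database IDs are placeholders, not the test fixtures.

# Hedged sketch of the experimental ReadFromSpanner transform flagged above.
# Project/instance/database IDs are placeholders only.
import apache_beam as beam
from apache_beam.io.gcp.experimental.spannerio import ReadFromSpanner

with beam.Pipeline() as p:
    rows = p | "ReadUsers" >> ReadFromSpanner(
        project_id="my-project",
        instance_id="my-instance",
        database_id="my-database",
        sql="select * from Users")
    rows | "Print" >> beam.Map(print)
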
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1734.81 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 50m 17s
217 actionable tasks: 162 executed, 51 from cache, 4 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (1 retry remaining)...

A network error occurred.

If you require assistance with this problem, please report it via https://gradle.com/help/plugin and include the following information via copy/paste.

----------
Gradle version: 6.9.1
Plugin version: 3.4.1
Request URL: https://scans-in.gradle.com/scans/publish/gradle/3.4.1/token
Request ID: 68c28eba-68b2-4eea-919a-7a62c9cab7ee
Exception: java.net.SocketTimeoutException: Read timed out
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4635

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4635/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Allow decoding a set of elements until we hit the block


------------------------------------------
[...truncated 314.42 KB...]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [30] / gw1 [30] / gw2 [30] / gw3 [30] / gw4 [30] / gw5 [30] / gw6 [30] / gw7 [30]

scheduling tests via LoadScheduling

apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_native 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_avro 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_legacy_sql 
apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it 
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_standard_sql 
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_data_only 
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes 
> Task :runners:flink:1.13:job-server:shadowJar

> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes

> Task :sdks:python:test-suites:direct:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --numprocesses=8 --timeout=4500 --color=yes --log-cli-level=INFO apache_beam/io/gcp/experimental/spannerio_read_it_test.py apache_beam/io/gcp/experimental/spannerio_write_it_test.py
>>>   collect markers: 
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw5] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:test-suites:direct:py37:spannerioIT

[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw6] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:39995
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f827c060ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f827c060f80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f827c05e710> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempc7n4e8nw/artifactskd9dslrc' '--job-port' '48459' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:46511'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:44397'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:48459'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:27 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:27 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
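
[Editor's note] The warning above is about keeping the driver program alive until the job finishes. A minimal, self-contained sketch of that pattern follows; the runner, options and data are illustrative, not this job's configuration.

# Minimal sketch of the context-manager pattern the LOOPBACK warning describes.
# Runner, options and data are illustrative only.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(["--runner=DirectRunner"])
with beam.Pipeline(options=options) as p:
    (p
     | "Create" >> beam.Create(["to be or not to be"])
     | "Split" >> beam.FlatMap(str.split)
     | "Pair" >> beam.Map(lambda word: (word, 1))
     | "Count" >> beam.CombinePerKey(sum)
     | "Print" >> beam.Map(print))
# Leaving the `with` block waits for the pipeline to reach a terminal state.
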
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:29 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:30 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:33 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:33 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46871.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:44635.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:44501
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.16 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
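
[Editor's note] The finalize_write lines above show the Python file sink committing four output shards once the job reports DONE. A hedged sketch of a write that produces sharded output like this; the path, suffix and shard count are placeholders, not this job's options.

# Hedged sketch: sharded text output, finalized and renamed by filebasedsink
# as in the log above. Path, suffix and shard count are placeholders.
import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | "Create" >> beam.Create(["a", "b", "c", "d"])
     | "Write" >> beam.io.WriteToText(
         "/tmp/wordcount/output", file_name_suffix=".txt", num_shards=4))
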
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.789422507","description":"Error received from peer ipv4:127.0.0.1:44501","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.789422507","description":"Error received from peer ipv4:127.0.0.1:44501","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.790495211","description":"Error received from peer ipv4:127.0.0.1:46871","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.790462071","description":"Error received from peer ipv4:127.0.0.1:44635","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
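
[Editor's note] The three tracebacks above all fire after the job has already reached DONE: the SDK worker's control, state and data channels are being torn down, so the blocked stream iterators surface StatusCode.UNAVAILABLE with "Socket closed". The sketch below treats that pattern as benign shutdown noise; it is purely illustrative and not the Beam SDK's own handling.

# Illustrative only; not the Beam SDK's actual shutdown handling.
# Treats UNAVAILABLE/CANCELLED raised while draining a gRPC stream during
# teardown as expected, and re-raises anything else.
import logging
import grpc

_BENIGN = (grpc.StatusCode.UNAVAILABLE, grpc.StatusCode.CANCELLED)

def drain_quietly(response_iterator, shutting_down=True):
    try:
        for response in response_iterator:
            yield response
    except grpc.RpcError as err:
        if shutting_down and err.code() in _BENIGN:
            logging.info("gRPC stream closed during shutdown: %s", err.code())
        else:
            raise
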


> Task :sdks:python:test-suites:portable:py37:postCommitPy37IT

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

warning: check: missing required meta-data: url

warning: check: missing meta-data: either (author and author_email) or (maintainer and maintainer_email) must be supplied
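
[Editor's note] The sdist/check warnings above come from a setuptools-style build step inside this task: sdist wants a README file, and check wants url plus author/maintainer metadata. A hedged sketch of the setup() fields the check step is complaining about, with placeholder values only:

# Placeholder metadata only; not Beam's actual packaging configuration.
from setuptools import setup, find_packages

setup(
    name="example-package",
    version="0.0.1",
    description="Example package",
    url="https://example.invalid/project",    # addresses: missing required meta-data: url
    author="Example Author",
    author_email="author@example.invalid",    # addresses the author/author_email warning
    packages=find_packages(),
)
# A README (or README.md/.rst/.txt) next to setup.py addresses the sdist warning.
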


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 50m 5s
217 actionable tasks: 179 executed, 34 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/pydxanh7uo63a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4634

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4634/display/redirect?page=changes>

Changes:

[Daniel Oliveira] [BEAM-13321] Pass TempLocation as pipeline option to Dataflow Go for

[noreply] [BEAM-12976] Pipeline visitor to discover pushdown opportunities.


------------------------------------------
[...truncated 20.02 MB...]
INFO:root:Successfully completed job in 26.470600605010986 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:33469
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fd2b16ecef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fd2b16ecf80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fd2b16ea710> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempg4keizay/artifacts1a0cw7ar' '--job-port' '33313' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:33313.
WARNING:root:Waiting for grpc channel to be ready at localhost:33313.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:23 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:37623'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:24 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:46813'
WARNING:root:Waiting for grpc channel to be ready at localhost:33313.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:24 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:33313'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:24 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:33313.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:27 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_0ef6377e-ae40-4154-a6d6-cea933e27a94.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:27 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_0ef6377e-ae40-4154-a6d6-cea933e27a94.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:27 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_0ef6377e-ae40-4154-a6d6-cea933e27a94.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:27 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_0ef6377e-ae40-4154-a6d6-cea933e27a94.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:28 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214063828-c710b88e_0df06838-87cf-4d31-97a1-735226b2192d'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:28 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214063828-c710b88e_0df06838-87cf-4d31-97a1-735226b2192d'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:28 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:30 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:33 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:33 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214063828-c710b88e_0df06838-87cf-4d31-97a1-735226b2192d on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:33621.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:44309.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:40929
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214063828-c710b88e_0df06838-87cf-4d31-97a1-735226b2192d: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.14 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 06:38:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214063828-c710b88e_0df06838-87cf-4d31-97a1-735226b2192d finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639463917.444714385","description":"Error received from peer ipv4:127.0.0.1:33621","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639463917.444694426","description":"Error received from peer ipv4:127.0.0.1:44309","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639463917.444680195","description":"Error received from peer ipv4:127.0.0.1:40929","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639463917.444680195","description":"Error received from peer ipv4:127.0.0.1:40929","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639464004.141837016","description":"Error received from peer ipv4:127.0.0.1:36261","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639464129.060884609","description":"Error received from peer ipv4:127.0.0.1:34803","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639464170.649346979","description":"Error received from peer ipv4:127.0.0.1:33473","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639464255.181906870","description":"Error received from peer ipv4:127.0.0.1:38439","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
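
The repeated "Multiplexer hanging up" tracebacks above come from the SDK harness's read_grpc_client_inputs thread: when the runner side closes the multiplexed data channel at the end of a job, the streaming RPC terminates with StatusCode.CANCELLED and the iterator re-raises it. A minimal sketch (not Beam's actual handling; the stream object is a placeholder) of treating CANCELLED as an expected shutdown on a gRPC response stream:

    # Hedged sketch: treat StatusCode.CANCELLED on a gRPC response stream as a
    # normal shutdown signal rather than a hard failure. `response_stream` is a
    # placeholder for any streaming-RPC iterator (an assumption, not Beam's API).
    import logging

    import grpc


    def drain_stream(response_stream):
        try:
            for element in response_stream:
                yield element
        except grpc.RpcError as err:
            # CANCELLED ("Multiplexer hanging up") is expected when the peer
            # hangs up the multiplexed channel at job teardown; anything else
            # is a real error and is re-raised.
            if err.code() == grpc.StatusCode.CANCELLED:
                logging.info("Data channel closed by peer: %s", err.details())
            else:
                raise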


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 32m 32s
217 actionable tasks: 182 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/aluss5zqe66zg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4633

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4633/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Better type hints for Count combiners.

[Kyle Weaver] Include name of missing tag in error message.

[noreply] Updating Grafana from v8.1.2 to v8.1.6

[noreply] Change Pub/Sub Lite PollResult to set explicit watermark (#16216)

[noreply] [BEAM-13454] Fix and test dataframe read_fwf. (#16064)


------------------------------------------
[...truncated 48.10 MB...]
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2447: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
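
The BeamDeprecationWarning above is triggered by reading options back off an already-constructed pipeline (p.options / pcoll.pipeline.options), which the message says will not be supported. A minimal sketch, assuming the usual pattern, of keeping a reference to the PipelineOptions used to build the pipeline and reading GoogleCloudOptions from that reference instead; the project and bucket names are placeholders:

    # Hedged sketch: hold on to the PipelineOptions object instead of reading
    # <pipeline>.options back later. Project/bucket values are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(
        project='my-project',                 # placeholder
        temp_location='gs://my-bucket/tmp')   # placeholder

    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)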

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
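
The FutureWarning from flight_delays.py is pandas' nuisance-columns deprecation: calling a reduction on a frame that still contains non-numeric columns. A short hedged sketch of the two fixes the message itself suggests; the column names are made up:

    # Hedged sketch of the fix pandas suggests; column names are placeholders.
    import pandas as pd

    df = pd.DataFrame({
        'carrier': ['AA', 'DL'],      # non-numeric "nuisance" column
        'dep_delay': [5.0, 12.0],
        'arr_delay': [3.0, 9.0],
    })

    # Either select only the numeric columns before reducing...
    means = df[['dep_delay', 'arr_delay']].mean()

    # ...or be explicit about dropping non-numeric columns.
    means = df.mean(numeric_only=True)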

apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
apache_beam/dataframe/io.py:632
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
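
The warning above points at the documented replacement: ReadFromBigQuery instead of BigQuerySource. A minimal sketch of the replacement call with the same query/use_standard_sql arguments; the project, query and temp bucket are placeholders, and a GCS temp location is assumed to be needed for the export-based read path:

    # Hedged sketch of the suggested replacement; all values are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        project='my-project',                 # placeholder
        temp_location='gs://my-bucket/tmp')   # placeholder

    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | 'ReadBQ' >> beam.io.ReadFromBigQuery(
                query='SELECT name FROM `my-project.my_dataset.my_table`',  # placeholder
                use_standard_sql=True)
            | beam.Map(print))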

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
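
Likewise, the BigQuerySink deprecation above points at WriteToBigQuery. A minimal hedged sketch; the table spec and schema are placeholders, and GCP credentials and temp locations are assumed to be configured elsewhere:

    # Hedged sketch of WriteToBigQuery in place of the deprecated BigQuerySink;
    # table spec and schema are placeholders.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'ada'}, {'name': 'lin'}])
            | beam.io.WriteToBigQuery(
                table='my-project:my_dataset.my_table',   # placeholder
                schema='name:STRING',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))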

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 173 warnings in 7556.07 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1710.03 seconds ==============
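
The warnings in this Spanner suite flag ReadFromSpanner and WriteToSpanner as experimental with no backwards-compatibility guarantees. A hedged usage sketch of the read-by-SQL form exercised above (sql="select * from Users"); the project/instance/database IDs are placeholders and the keyword names are assumed from the call sites shown in the warnings:

    # Hedged sketch of the experimental Spanner read transform exercised by the
    # IT above; IDs are placeholders and the keyword names are an assumption.
    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import ReadFromSpanner

    with beam.Pipeline() as p:
        _ = (
            p
            | ReadFromSpanner(
                project_id='my-project',      # placeholder
                instance_id='my-instance',    # placeholder
                database_id='my-database',    # placeholder
                sql='select * from Users')
            | beam.Map(print))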

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 44m 0s
217 actionable tasks: 166 executed, 47 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4arecmico5fii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4632

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4632/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-13423]: fix frontend failure if no examples

[daria.malkova] change return type of 2 methods

[mmack] [BEAM-13441] Use quiet delete for S3 batch deletes. In quiet mode only

[daria.malkova] Docs for validators tests

[daria.malkova] change context type

[noreply] Merge pull request #16140 from [BEAM-13377][Playground] Update CI/CD

[noreply] Merge pull request #16120 from [BEAM-13333][Playground] Save Python logs

[noreply] Merge pull request #16185 from [BEAM-13425][Playground][Bugfix] Support

[mmack] [BEAM-13445] Correctly set data limit when flushing S3 upload buffer and

[noreply] Merge pull request #16121 from [BEAM-13334][Playground] Save Go logs to

[noreply] Merge pull request #16179 from [BEAM-13344][Playground] support python

[noreply] Merge pull request #16208 from [BEAM-13442][Playground] Filepath to log

[noreply] [BEAM-13276] bump jackson-core to 2.13.0 for .test-infra (#16062)


------------------------------------------
[...truncated 18.46 MB...]
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:01 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:01 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1213182954-af9b0a62_f4828ebf-0f9a-446b-a799-31484f8a7432 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:44423.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:46117.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:35493
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213182954-af9b0a62_f4828ebf-0f9a-446b-a799-31484f8a7432: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.14 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:30:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1213182954-af9b0a62_f4828ebf-0f9a-446b-a799-31484f8a7432 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639420206.023490810","description":"Error received from peer ipv4:127.0.0.1:35493","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639420206.023490810","description":"Error received from peer ipv4:127.0.0.1:35493","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639420206.023514889","description":"Error received from peer ipv4:127.0.0.1:44423","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639420206.023514885","description":"Error received from peer ipv4:127.0.0.1:46117","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639420276.037435202","description":"Error received from peer ipv4:127.0.0.1:36319","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1589.79 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 182

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 27m 18s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/y4ltoxqgl3zmc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4631

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4631/display/redirect>

Changes:


------------------------------------------
[...truncated 25.93 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac0TzRPNEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
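
The base64 payloads in the monitoring infos above ("AQ==", "Ac0TzRPNEw==", ...) are, as far as I understand the FnAPI metrics encoding, sequences of unsigned varints: a single value for sum_int64 counters and four values (count, sum, min, max) for distribution_int64. A small self-contained decoder sketch that reproduces the numbers, e.g. "AQ==" -> [1] and "Ac0TzRPNEw==" -> [1, 2509, 2509, 2509]:

    # Hedged sketch: decode the base64 metric payloads above as a sequence of
    # unsigned varints (the sum_int64 / distribution_int64 encoding, as I
    # understand it).
    import base64


    def decode_varints(data: bytes):
        values, shift, current = [], 0, 0
        for byte in data:
            current |= (byte & 0x7F) << shift
            if byte & 0x80:            # continuation bit set: more bytes follow
                shift += 7
            else:                      # last byte of this varint
                values.append(current)
                shift, current = 0, 0
        return values


    print(decode_varints(base64.b64decode('AQ==')))          # [1]
    print(decode_varints(base64.b64decode('Ac0TzRPNEw==')))  # [1, 2509, 2509, 2509]
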
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:9, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 990c87464636cfcc97285d991c6101ae, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:0, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 2e04e9e3f25c30c93fe6b4ba283e8420, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:7, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 839341210faf8ab32af1bd0273ddcdd0, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:1, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 046611c69ef77929e3a8d9df643aeaa2, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:13, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: a313e50b4f321876705bf4efc17f4925, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job a01dba08f0a96c57dcd2ae1441667ae1 with leader id 9dc480e17634f62cca1fcc0240ec4a37 lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:8, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 11c9fae83a011da2abd96c708af2914a, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:5, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: bd35fd290a7917aba557a6fabd29d75e, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:10, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: b55b024bdd33e938984a390fbfa37005, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:15, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: c1720ff62b72ee9149a9c60bec2e1df2, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:2, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 36b8fe907d72eb995ec538a8b5e34ab0, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:4, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: db09496cf863dafe6747c95942abdb90, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:6, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 44a44d023fb01f9e29b727daa3410db9, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:14, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 340a952c6d43a4de11a781cd0d7a1936, jobId: a01dba08f0a96c57dcd2ae1441667ae1).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-8a94d1b5-1885-4ab1-934d-cb348ecbb3db'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-89c4eef0-5f20-4f37-903d-173a39067ae2'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-7a23a0d2-0d1b-4d10-9346-0772b1a4c188'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:40345'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 13, 2021 12:23:58 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 280.90 seconds ==========================

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
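
The FutureWarnings in this summary refer to the experimental Spanner transforms
exercised by these tests. A minimal sketch of how they are typically used, with
hypothetical project/instance/database IDs and table contents (not taken from
this build; no backwards-compatibility guarantees, as the warnings note):

    # Read rows with ReadFromSpanner and write WriteMutation objects with
    # WriteToSpanner; both live in the experimental spannerio module.
    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        ReadFromSpanner, WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read' >> ReadFromSpanner(
                project_id='my-project', instance_id='my-instance',
                database_id='my-db', sql='SELECT UserId, Key FROM Users')
            | beam.Map(print))

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([WriteMutation.insert(
                table='Users', columns=('UserId', 'Key'),
                values=[(1, 'key-1')])])
            | 'Write' >> WriteToSpanner(
                project_id='my-project', instance_id='my-instance',
                database_id='my-db'))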

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1561.49 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 18m 58s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/eqcucbyix3i7w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4630

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4630/display/redirect>

Changes:


------------------------------------------
[...truncated 48.04 MB...]
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2447: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
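
The BeamDeprecationWarning above flags reads of <pipeline>.options after the
pipeline has been constructed. The supported pattern is to build a
PipelineOptions object up front and pass it to the Pipeline; a minimal sketch
(option values here are hypothetical, not taken from this build):

    # Build options first, then pass them to the Pipeline; read views from the
    # options object rather than from <pipeline>.options.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(
        runner='DirectRunner',                 # hypothetical runner choice
        temp_location='gs://my-bucket/temp')   # hypothetical GCS bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)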

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
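
The pandas FutureWarning above concerns reductions over a DataFrame that still
contains non-numeric ("nuisance") columns. A minimal sketch of the fix pandas
suggests, using a made-up frame rather than the flight-delays data:

    # Select numeric columns explicitly before reducing, or pass
    # numeric_only=True, so future pandas versions do not raise.
    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'UA'],
                       'arr_delay': [5.0, 12.0],
                       'dep_delay': [1.0, 3.0]})
    means = df.select_dtypes('number').mean()   # or: df.mean(numeric_only=True)
    print(means)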

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
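
The PendingDeprecationWarning above comes from google-cloud-bigquery's
Client.dataset() helper. A minimal sketch of the replacement the message
suggests, with hypothetical project/dataset/table IDs:

    # Build the reference from a fully qualified string instead of
    # client.dataset(...).table(...).
    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')
    table_ref = bigquery.TableReference.from_string(
        'my-project.my_dataset.my_table')
    table = client.get_table(table_ref)  # get_table also accepts the raw string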

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
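
The warning above points at the deprecated beam.io.BigQuerySource. A minimal
sketch of the suggested replacement, ReadFromBigQuery, with a hypothetical
query and GCS temp location:

    # ReadFromBigQuery replaces BigQuerySource; it exports query results via
    # GCS, so a gcs_location (or pipeline temp_location) is needed.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.io.ReadFromBigQuery(
                query='SELECT name FROM `my-project.my_dataset.my_table`',
                use_standard_sql=True,
                gcs_location='gs://my-bucket/bq-export')
            | beam.Map(print))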

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
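
The BigQuerySink deprecation above names WriteToBigQuery as its replacement; a
minimal sketch with a hypothetical table and schema:

    # WriteToBigQuery replaces BigQuerySink for writing dict rows to a table.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a'}, {'name': 'b'}])
            | beam.io.WriteToBigQuery(
                'my-project:my_dataset.my_table',
                schema='name:STRING',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))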

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 175 warnings in 7445.64 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1576.45 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 34m 17s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cfssplp75mnqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4629

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4629/display/redirect>

Changes:


------------------------------------------
[...truncated 48.18 MB...]
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2447: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 171 warnings in 6226.37 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1677.88 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 15m 15s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/kdcxpw472isne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4628

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4628/display/redirect>

Changes:


------------------------------------------
[...truncated 27.50 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
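
The base64 "payload" values in the metrics dump above are compact binary
encodings. Under the assumption that they use the SDK's usual VarInt
convention (a single varint for beam:metrics:sum_int64:v1 counters; count,
sum, min and max varints for beam:metrics:distribution_int64:v1
distributions), a small throwaway helper can decode them for inspection; the
encoding is an assumption inferred from the SDK, not something stated in this
log:

    # Illustrative decoder for the base64 metric payloads printed above.
    # Encoding assumption: base-128 varints; distribution payloads carry
    # count, sum, min, max in that order.
    import base64

    def _read_varint(buf, pos):
        """Decode one base-128 varint starting at pos; return (value, new_pos)."""
        result, shift = 0, 0
        while True:
            byte = buf[pos]
            pos += 1
            result |= (byte & 0x7F) << shift
            if not byte & 0x80:
                return result, pos
            shift += 7

    def decode_payload(b64):
        """Return the list of varints packed into a base64 metric payload."""
        buf = base64.b64decode(b64)
        values, pos = [], 0
        while pos < len(buf):
            value, pos = _read_varint(buf, pos)
            values.append(value)
        return values

    # Payload copied from the sampled_byte_size entry above.
    print(decode_payload('Ac4TzhPOEw=='))
    # -> [1, 2510, 2510, 2510] (count, sum, min, max) if the assumption holds
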
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-5d2a6781-af94-4195-8c7f-f1c8df31cee4'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-5947b7a1-a270-4eaf-a401-daa91ac93345'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-2f8011e4-fa1d-4ee7-94be-e7c2ca5e7237'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:38715'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:21:25 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 295.52 seconds ==========================
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639333146.116226382","description":"Error received from peer ipv4:127.0.0.1:46243","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639333189.710699068","description":"Error received from peer ipv4:127.0.0.1:46651","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639333284.954058156","description":"Error received from peer ipv4:127.0.0.1:43287","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
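
The three tracebacks above are raised in the SDK worker's background gRPC
reader threads after the Flink suite has already reported "7 passed"; the
status is StatusCode.CANCELLED with "Multiplexer hanging up", which suggests
the data channel was being torn down at shutdown rather than a test assertion
failing. As a generic illustration (not the SDK's actual handling), client
code that wants to tolerate this kind of shutdown-time cancellation on a
streaming read typically does something like:

    # Generic sketch: treat CANCELLED on a streaming gRPC read as benign
    # shutdown noise. This is not the Beam SDK's actual handling.
    import grpc

    def drain(response_iterator, handle):
        """Consume a streaming RPC; swallow CANCELLED raised at teardown."""
        try:
            for elements in response_iterator:
                handle(elements)
        except grpc.RpcError as err:
            # On streaming responses the raised object also implements
            # grpc.Call, so code()/details() are available.
            if err.code() == grpc.StatusCode.CANCELLED:
                return  # e.g. "Multiplexer hanging up" during shutdown
            raise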


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1668.66 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 35m 37s
217 actionable tasks: 159 executed, 54 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cz7ynmrip73z4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4627

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4627/display/redirect>

Changes:


------------------------------------------
[...truncated 48.18 MB...]
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2447: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 173 warnings in 7601.53 seconds ========
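
Many of the repeated entries in the warnings summary above are the
long-standing BeamDeprecationWarning about reading options back off a
constructed pipeline ("References to <pipeline>.options will not be
supported"). As a rough illustration of the distinction the warning draws,
with placeholder option values and one common alternative rather than a
statement of the SDK's plans:

    # Sketch of the access pattern behind the BeamDeprecationWarning above.
    # The option values are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(
        project='my-project',                 # placeholder
        temp_location='gs://my-bucket/temp')  # placeholder

    p = beam.Pipeline(options=options)

    # Deprecated access: reading options back off the pipeline object,
    # as in bigquery_file_loads.py:1112 above.
    temp_location_via_pipeline = p.options.view_as(
        GoogleCloudOptions).temp_location

    # Alternative: keep a reference to the PipelineOptions used to build
    # the pipeline and read from it directly.
    temp_location_via_options = options.view_as(
        GoogleCloudOptions).temp_location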

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1660.97 seconds ==============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 40m 39s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/cmnqc6z3p4le6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4626

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4626/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Plumb the cache through contexts and transform executors.


------------------------------------------
[...truncated 41.47 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "ref_AppliedPTransform_Generate_3"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:process_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/read/ref_PCollection_PCollection_2:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:ptransform_execution_time:total_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "ref_AppliedPTransform_Map-to-row_4"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:start_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/write/ref_PCollection_PCollection_1:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:process_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/write/ref_PCollection_PCollection_1:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:finish_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/write/ref_PCollection_PCollection_1:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:ptransform_execution_time:total_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/write/ref_PCollection_PCollection_1:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:finish_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "ref_AppliedPTransform_Generate_3"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:pardo_execution_time:process_bundle_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "ref_AppliedPTransform_Generate_3"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:data_channel:read_index:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/read/ref_PCollection_PCollection_2:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:ptransform_execution_time:total_msecs:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/read/ref_PCollection_PCollection_2:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ag==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ag==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_3"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ah4PDw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_3"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ0NDQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AjAYGA==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "ref_PCollection_PCollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:data_channel:read_index:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/read/pcollection_1:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac0TzRPNEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
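
The base64 payloads in the metrics block above appear to be plain varint-encoded integers: a single value for beam:metrics:sum_int64:v1 and count/sum/min/max for beam:metrics:distribution_int64:v1 (assuming the usual MonitoringInfo payload encoding). A quick way to eyeball them while debugging a run like this, under that assumption:

    import base64

    def read_varints(data):
        """Decode a sequence of unsigned base-128 varints from bytes."""
        values, current, shift = [], 0, 0
        for byte in data:
            current |= (byte & 0x7F) << shift
            shift += 7
            if not byte & 0x80:  # high bit clear: last byte of this varint
                values.append(current)
                current, shift = 0, 0
        return values

    # sum_int64 payload from the log above: a single counter value
    print(read_varints(base64.b64decode("Ag==")))      # [2]
    # distribution_int64 payload: count, sum, min, max
    print(read_varints(base64.b64decode("Ah4PDw==")))  # [2, 30, 15, 15]
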
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-0c2331f0-ba0a-4dfb-928a-bc7bc6ce5197'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-c7428121-c10e-451d-ba2c-19182fea4a7b'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-e7c30972-52d6-4c1f-8f8e-56320346930e'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:37849'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 12, 2021 6:17:42 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 226.80 seconds ==========================

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 15m 52s
217 actionable tasks: 155 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/rvqyjgblx5zw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4625

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4625/display/redirect>

Changes:


------------------------------------------
[...truncated 51.28 MB...]

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or
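
Most of the BeamDeprecationWarning noise in these summaries comes from the SDK itself reading <pipeline>.options, so the tests cannot silence it, but in user code the usual way around the deprecation is to build the options object up front and read settings from it directly rather than back through the pipeline. A minimal sketch (project and bucket names are placeholders):

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(
        project='my-project',                 # placeholder project id
        temp_location='gs://my-bucket/temp',  # placeholder GCS path
    )
    gcp_options = options.view_as(GoogleCloudOptions)

    with Pipeline(options=options) as p:
        # read gcp_options.temp_location here instead of p.options.view_as(...)
        ...
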

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
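
For the PendingDeprecationWarning just above, the google-cloud-bigquery client prefers explicit references (or fully qualified strings) over Client.dataset(); a small sketch of the suggested replacements, with placeholder project/dataset/table names:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Build the reference types directly (names are placeholders)...
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')

    # ...or let the client parse a fully qualified string.
    table = client.get_table('my-project.my_dataset.my_table')
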

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
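
The deprecated BigQuerySource usage flagged here has a near drop-in replacement in ReadFromBigQuery; roughly (query and table names are placeholders, and query reads expect a GCS temp/gcs_location on the pipeline options):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = (
            p
            | 'Read' >> beam.io.ReadFromBigQuery(
                query='SELECT name FROM `my-project.my_dataset.my_table`',  # placeholder query
                use_standard_sql=True)
            | beam.Map(print))
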

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
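
Likewise for the BigQuerySink deprecation: the replacement transform is WriteToBigQuery. A minimal sketch with placeholder names (batch loads also expect a temp_location on the pipeline options):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'beam'}])
            | beam.io.WriteToBigQuery(
                table='my-project:my_dataset.my_table',  # placeholder table spec
                schema='name:STRING',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
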

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())
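
ReadAllFromBigQuery, the experimental transform flagged above, takes a PCollection of read requests rather than configuration at construction time; a rough sketch, with placeholder table and query values:

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import ReadFromBigQueryRequest

    with beam.Pipeline() as p:
        rows = (
            p
            | beam.Create([
                ReadFromBigQueryRequest(table='my-project:my_dataset.my_table'),  # placeholder
                ReadFromBigQueryRequest(query='SELECT 1 AS one'),                 # placeholder
            ])
            | beam.io.ReadAllFromBigQuery())
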

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 183 warnings in 6339.81 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1602.56 seconds ==============
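
For reference, the experimental Spanner transforms those warnings point at live in apache_beam.io.gcp.experimental.spannerio; a bare-bones read/write sketch with placeholder project, instance and database ids:

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        ReadFromSpanner, WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        users = p | 'Read' >> ReadFromSpanner(
            project_id='my-project',      # placeholder ids
            instance_id='my-instance',
            database_id='my-database',
            sql='SELECT UserId, Key FROM Users')

        _ = (
            p
            | beam.Create([WriteMutation.insert(
                table='Users',
                columns=('UserId', 'Key'),
                values=[('user-1', 'key-1')])])   # placeholder row
            | 'Write' >> WriteToSpanner(
                project_id='my-project',
                instance_id='my-instance',
                database_id='my-database'))
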

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 16m 10s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/qx6mgwujkhctu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python37 #4624

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4624/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12683]  Fix failing integration tests for Python Recommendation AI


------------------------------------------
[...truncated 54.46 MB...]
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 58 passed, 8 skipped, 169 warnings in 7512.65 seconds ========

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1578.27 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 37m 2s
217 actionable tasks: 153 executed, 60 from cache, 4 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/rngoyb5uh7eq6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
