Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/11/16 04:02:39 UTC
Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #10085
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/10085/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Fix arguments to checkState in BatchViewOverrides
[noreply] Add error reporting for BatchConverter match failure (#24022)
[noreply] Update automation to use Go 1.19 (#24175)
[noreply] Fix broken json for notebook (#24183)
[noreply] Using Teardown context instead of deprecated finalize (#24180)
[noreply] [Python] Support pipe operator as Union (PEP 604) (#24106)
[noreply] Updated README of Interactive Beam
[noreply] Minor update
[noreply] Add custom inference function support to the PyTorch model handler
[noreply] Strip FGAC database role from changestreams metadata requests (#24177)
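A note on the PEP 604 change in the list above (#24106): PEP 604 lets `X | Y` stand in for `typing.Union[X, Y]` in type hints. A minimal sketch of the two equivalent spellings (the function names are illustrative, not taken from the Beam PR):

```python
from __future__ import annotations  # lets the | syntax parse on Python < 3.10

from typing import Union


def parse_id_union(raw: Union[int, str]) -> str:
    """Classic typing.Union spelling."""
    return str(raw)


def parse_id_pipe(raw: int | str) -> str:
    """PEP 604 pipe-operator spelling of the same hint."""
    return str(raw)
```

With `from __future__ import annotations`, the pipe form stays a lazily evaluated string, so it parses even on interpreters that predate native `int | str` support.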
------------------------------------------
[...truncated 174.76 KB...]
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116034846-254679-nthvafm2.1668570526.255016/dataflow_python_sdk.tar in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116034846-254679-nthvafm2.1668570526.255016/dataflow-worker.jar...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116034846-254679-nthvafm2.1668570526.255016/dataflow-worker.jar in 5 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116034846-254679-nthvafm2.1668570526.255016/pipeline.pb...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116034846-254679-nthvafm2.1668570526.255016/pipeline.pb in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20221116034846257091-2535'
createTime: '2022-11-16T03:48:56.638050Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-11-15_19_48_56-1205098966693306554'
location: 'us-central1'
name: 'beamapp-jenkins-1116034846-254679-nthvafm2'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-11-16T03:48:56.638050Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-11-15_19_48_56-1205098966693306554]
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-11-15_19_48_56-1205098966693306554
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:915 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_19_48_56-1205098966693306554?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_19_48_56-1205098966693306554?project=apache-beam-testing
[32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
[32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_19_48_56-1205098966693306554?project=apache-beam-testing
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-15_19_48_56-1205098966693306554 is in state JOB_STATE_RUNNING
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:48:57.112Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:00.487Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-11-15_19_48_56-1205098966693306554. The number of workers will be between 1 and 100.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:00.517Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-11-15_19_48_56-1205098966693306554.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:10.575Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.365Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.385Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.451Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.492Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.521Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.546Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.579Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.627Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.652Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.672Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/ToProtobuf into generate_metrics
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.698Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write into dump_to_pub/ToProtobuf
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.739Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.765Z: JOB_MESSAGE_BASIC: Using cloud KMS key to protect persistent state.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.866Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.896Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.926Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:13.966Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:15.047Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:15.073Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:15.124Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:38.263Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:49:58.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
[33mWARNING [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:254 Timing out on waiting for job 2022-11-15_19_48_56-1205098966693306554 after 60 seconds
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
[gw2] PASSED apache_beam/transforms/util_test.py::ReshuffleTest::test_reshuffle_preserves_timestamps
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
from imp import load_source
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
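The DeprecationWarning above comes from `hdfs/config.py` still calling `imp.load_source`; the `imp` module has been deprecated since Python 3.4 and was removed in 3.12. A hedged sketch of the importlib-based replacement the warning points at (the helper name mirrors the old API; it is not part of the hdfs package):

```python
import importlib.util
import sys


def load_source(modname, filename):
    """Replacement for the deprecated imp.load_source, built on importlib."""
    spec = importlib.util.spec_from_file_location(modname, filename)
    module = importlib.util.module_from_spec(spec)
    # Register before executing so the module body can import itself,
    # matching imp.load_source behavior.
    sys.modules[modname] = module
    spec.loader.exec_module(module)
    return module
```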
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-xdist.xml> -
============ 30 passed, 8 skipped, 9 warnings in 3047.13s (0:50:47) ============
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.44.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO
>>> collect markers: -m=it_validatesrunner and not no_sickbay_streaming and no_xdist
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-7.2.0, pluggy-1.0.0
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python,> configfile: pytest.ini
plugins: xdist-2.5.0, timeout-2.1.0, hypothesis-6.57.1, forked-1.4.0, requests-mock-1.10.0
timeout: 4500.0s
timeout method: signal
timeout func_only: False
----------------------------- live log collection ------------------------------
[33mWARNING [0m apache_beam.runners.interactive.interactive_environment:interactive_environment.py:191 Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
[33mWARNING [0m apache_beam.runners.interactive.interactive_environment:interactive_environment.py:200 You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
[33mWARNING [0m root:avroio_test.py:54 python-snappy is not installed; some tests will be skipped.
[33mWARNING [0m root:tfrecordio_test.py:55 Tensorflow is not installed, so skipping some tests.
[32mINFO [0m root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.44.0.dev
collected 6738 items / 6737 deselected / 5 skipped / 1 selected
apache_beam/runners/dataflow/dataflow_exercise_streaming_metrics_pipeline_test.py::ExerciseStreamingMetricsPipelineTest::test_streaming_pipeline_returns_expected_user_metrics_fnapi_it
-------------------------------- live log call ---------------------------------
[32mINFO [0m apache_beam.runners.portability.stager:stager.py:780 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp1dkaahjn/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp37m', '--platform', 'manylinux2014_x86_64']
[32mINFO [0m apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:485 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
[32mINFO [0m root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.44.0.dev
[32mINFO [0m root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20221021
[32mINFO [0m root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20221021" for Docker environment
[32mINFO [0m apache_beam.internal.gcp.auth:auth.py:130 Setting socket default timeout to 60 seconds.
[32mINFO [0m apache_beam.internal.gcp.auth:auth.py:133 socket default timeout is 60.0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/requirements.txt...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/requirements.txt in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/pickled_main_session...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/pickled_main_session in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/mock-2.0.0-py2.py3-none-any.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/seaborn-0.12.1-py3-none-any.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/seaborn-0.12.1-py3-none-any.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/PyHamcrest-1.10.1-py3-none-any.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/beautifulsoup4-4.11.1-py3-none-any.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/parameterized-0.7.5-py2.py3-none-any.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.6.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.6.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.6.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.6.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.6.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/matplotlib-3.6.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/dataflow_python_sdk.tar...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/dataflow_python_sdk.tar in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/dataflow-worker.jar...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/dataflow-worker.jar in 6 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/pipeline.pb...
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:751 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1116035113-267840-nthvafm2.1668570673.268180/pipeline.pb in 0 seconds.
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
clientRequestId: '20221116035113269982-2535'
createTime: '2022-11-16T03:51:24.283586Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2022-11-15_19_51_23-7445149226609853949'
location: 'us-central1'
name: 'beamapp-jenkins-1116035113-267840-nthvafm2'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-11-16T03:51:24.283586Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-11-15_19_51_23-7445149226609853949]
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-11-15_19_51_23-7445149226609853949
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:920 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_19_51_23-7445149226609853949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_19_51_23-7445149226609853949?project=apache-beam-testing
[32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
[32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_19_51_23-7445149226609853949?project=apache-beam-testing
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-15_19_51_23-7445149226609853949 is in state JOB_STATE_RUNNING
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:24.844Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:25.086Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-11-15_19_51_23-7445149226609853949. The number of workers will be between 1 and 100.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:25.122Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-11-15_19_51_23-7445149226609853949.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:32.870Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.138Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.173Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.232Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.271Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.302Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.327Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.352Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.392Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.420Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.454Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/ToProtobuf into generate_metrics
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.483Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write into dump_to_pub/ToProtobuf
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.514Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.548Z: JOB_MESSAGE_BASIC: Using cloud KMS key to protect persistent state.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.634Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.662Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.720Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:36.746Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:37.799Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:37.828Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:37.866Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:51:57.619Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-16T03:52:20.725Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
[33mWARNING [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:257 Timing out on waiting for job 2022-11-15_19_51_23-7445149226609853949 after 60 seconds
> Task :sdks:python:test-suites:dataflow:py39:validatesRunnerStreamingTests
PASSED
=============================== warnings summary ===============================
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
from imp import load_source
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
apache_beam/typehints/pandas_type_compatibility_test.py:67
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
}).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(475, 575), name='another_index'),
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
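The pandas FutureWarnings above point at `pd.Int64Index`, which was deprecated in pandas 1.4 and removed in 2.0. A sketch of the migration the warning message suggests (assuming a pandas installation; the index name mirrors the test file above):

```python
import pandas as pd

# Deprecated spelling warned about in the log:
#   pd.Int64Index(range(123, 223), name='an_index')
# Replacement: a plain pd.Index with an explicit integer dtype.
an_index = pd.Index(range(123, 223), name='an_index', dtype='int64')
```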
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py39-noxdist.xml> -
==== 1 passed, 5 skipped, 6737 deselected, 5 warnings in 588.20s (0:09:48) =====
> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
PASSED
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
from imp import load_source
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py37-noxdist.xml> -
===== 1 passed, 5 skipped, 6737 deselected, 1 warning in 695.16s (0:11:35) =====
> Task :sdks:python:test-suites:dataflow:validatesRunnerStreamingTests
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 236
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 43m
97 actionable tasks: 64 executed, 31 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/r2j72zpjpx2ro
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #10086
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/10086/display/redirect>