Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/10/22 06:36:34 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Dataflow #1346

See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1346/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Create a multiplexer that sends Elements based upon


------------------------------------------
[...truncated 458.69 KB...]
crcmod==1.7
Cython==0.29.21
dataclasses==0.8
dill==0.3.1.1
docopt==0.6.2
fastavro==1.0.0.post1
fasteners==0.16.3
flatbuffers==1.12
freezegun==0.3.15
future==0.18.2
gast==0.4.0
google-api-core==1.31.3
google-api-python-client==2.27.0
google-apitools==0.5.31
google-auth==1.31.0
google-auth-httplib2==0.1.0
google-auth-oauthlib==0.4.6
google-cloud-bigquery==1.26.1
google-cloud-bigquery-storage==2.9.1
google-cloud-bigtable==1.0.0
google-cloud-core==1.4.1
google-cloud-datastore==1.15.3
google-cloud-dlp==0.13.0
google-cloud-language==1.3.0
google-cloud-profiler==3.0.4
google-cloud-pubsub==1.0.2
google-cloud-recommendations-ai==0.2.0
google-cloud-spanner==1.13.0
google-cloud-videointelligence==1.13.0
google-cloud-vision==0.42.0
google-crc32c==1.3.0
google-pasta==0.2.0
google-python-cloud-debugger==2.15
google-resumable-media==1.3.3
googleapis-common-protos==1.53.0
grpc-google-iam-v1==0.12.3
grpcio==1.40.0
grpcio-gcp==0.2.2
guppy3==3.0.10
h5py==3.1.0
hdfs==2.5.8
httplib2==0.19.1
idna==2.10
importlib-metadata==4.8.1
joblib==1.1.0
keras==2.6.0
Keras-Preprocessing==1.1.2
libcst==0.3.21
Markdown==3.3.4
mmh3==2.5.1
more-itertools==8.10.0
mypy-extensions==0.4.3
nltk==3.5
numpy==1.19.5
oauth2client==4.1.3
oauthlib==3.1.1
opt-einsum==3.3.0
orjson==3.6.1
packaging==21.0
pandas==1.1.5
Pillow==7.2.0
pip==21.2.4
pluggy==0.13.1
proto-plus==1.19.5
protobuf==3.17.3
protorpc==0.12.0
PTable==0.9.2
py==1.10.0
pyarrow==3.0.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
pydot==1.4.1
PyHamcrest==1.10.1
pymongo==3.10.1
pyparsing==2.4.7
pytest==4.6.11
python-dateutil==2.8.1
python-gflags==3.1.2
python-snappy==0.5.4
pytz==2020.1
PyYAML==5.4
regex==2021.10.21
requests==2.24.0
requests-oauthlib==1.3.0
rsa==4.7.2
scikit-learn==0.24.1
scipy==1.4.1
setuptools==57.5.0
six==1.15.0
soupsieve==2.2.1
tenacity==8.0.1
tensorboard==2.7.0
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.0
tensorflow==2.6.0
tensorflow-estimator==2.6.0
termcolor==1.1.0
threadpoolctl==3.0.0
tqdm==4.62.3
typing-extensions==3.7.4.3
typing-inspect==0.7.1
uritemplate==4.1.1
urllib3==1.25.11
wcwidth==0.2.5
Werkzeug==2.0.2
wheel==0.37.0
wrapt==1.12.1
zipp==3.6.0
Removing intermediate container 0401ed16313a
 ---> 3b5d4950c9da
Step 22/27 : RUN pip check
 ---> Running in cadc8fb3509a
No broken requirements found.
Removing intermediate container cadc8fb3509a
 ---> 83ba85564938
Step 23/27 : COPY target/LICENSE /opt/apache/beam/
 ---> ef95de1f5738
Step 24/27 : COPY target/LICENSE.python /opt/apache/beam/
 ---> 155c110ad54d
Step 25/27 : COPY target/NOTICE /opt/apache/beam/
 ---> 2cb11cf0123e
Step 26/27 : ADD target/launcher/linux_amd64/boot /opt/apache/beam/
 ---> a44f2fe0ba3c
Step 27/27 : ENTRYPOINT ["/opt/apache/beam/boot"]
 ---> Running in 2fc9b15ab0f5
Removing intermediate container 2fc9b15ab0f5
 ---> 86d6a059c0d7
Successfully built 86d6a059c0d7
Successfully tagged apache/beam_python3.6_sdk:2.35.0.dev

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerSetup
Launching Java expansion service @ 43033
Launching Python expansion service @ 36411

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --sdk_harness_container_image_overrides=.*java.*,us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022060108 --experiments=use_runner_v2 --experiments=shuffle_mode=appliance --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>
>>>   pytest options: --capture=no --numprocesses=8 --timeout=4500 --log-cli-level=INFO
>>>   collect markers: -m=xlang_transforms
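
For reference, here is a minimal Python sketch of how options like the ones printed above are handed to a Beam pipeline. It is an illustration only, not part of the build output: the flag values are copied from the log, while the trivial pipeline body, the placeholder temp_location bucket, and the assumption of valid GCP credentials are hypothetical.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # The override maps any harness image whose name matches the regex onto the
    # per-build GCR tag. If that tag is missing from the registry, Dataflow
    # workers cannot pull the image and the job never starts.
    options = PipelineOptions(flags=[
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://YOUR_BUCKET/tmp',  # hypothetical bucket
        '--experiments=use_runner_v2',
        '--sdk_harness_container_image_overrides='
        '.*java.*,us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022060108',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b']) | beam.Map(lambda x: '0' + x)
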
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw2] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.6.8 (default, Dec 24 2018, 19:24:27)  -- [GCC 5.4.0 20160609]
gw0 [10] / gw1 [10] / gw2 [10] / gw3 [10] / gw4 [10] / gw5 [10] / gw6 [10] / gw7 [10]

scheduling tests via LoadScheduling

apache_beam/io/external/generate_sequence_test.py::XlangGenerateSequenceTest::test_generate_sequence 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_multi_input_output_with_sideinput 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_combine_globally 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_combine_per_key 
apache_beam/io/external/generate_sequence_test.py::XlangGenerateSequenceTest::test_generate_sequence_java_class_lookup_payload_builder 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_cogroup_by_key 
apache_beam/io/external/generate_sequence_test.py::XlangGenerateSequenceTest::test_generate_sequence_java_external_transform 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_group_by_key 
[gw6] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_group_by_key 
[gw2] PASSED apache_beam/io/external/generate_sequence_test.py::XlangGenerateSequenceTest::test_generate_sequence 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_partition 
[gw4] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_combine_per_key 
[gw1] PASSED apache_beam/io/external/generate_sequence_test.py::XlangGenerateSequenceTest::test_generate_sequence_java_external_transform 
[gw3] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_combine_globally 
[gw7] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_multi_input_output_with_sideinput 
[gw0] PASSED apache_beam/io/external/generate_sequence_test.py::XlangGenerateSequenceTest::test_generate_sequence_java_class_lookup_payload_builder 
apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_prefix 
[gw5] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_cogroup_by_key 
[gw2] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_partition 
[gw0] PASSED apache_beam/transforms/validate_runner_xlang_test.py::ValidateRunnerXlangTest::test_prefix 

=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
================== 10 passed, 14 warnings in 1239.69 seconds ===================

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 11228.
Stopping expansion service pid: 11231.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022060108
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:983c8fb8eae123cc4902937cfde88cbc43a4ad2ff28985f6afcee9fc5016aaaa
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022060108]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:983c8fb8eae123cc4902937cfde88cbc43a4ad2ff28985f6afcee9fc5016aaaa]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022060108] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:983c8fb8eae123cc4902937cfde88cbc43a4ad2ff28985f6afcee9fc5016aaaa])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:46f9964fe136d13b950183e853138e5b26b0cdb9f38b060297321a0efa094d83
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:46f9964fe136d13b950183e853138e5b26b0cdb9f38b060297321a0efa094d83
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 'date': 'Fri, 22 Oct 2021 06:36:31 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 'sha256:46f9964fe136d13b950183e853138e5b26b0cdb9f38b060297321a0efa094d83': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 282

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
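
The cleanup failure above is the cleanup_untagged_gcr_images.sh script treating a 404 on an already-removed digest as fatal. Below is a hedged Python sketch of a more tolerant cleanup loop; it is an illustration under stated assumptions (repository name taken from the log, gcloud CLI on PATH, string-matching gcloud's "Not found" message), not the Beam script itself.

    import json
    import subprocess

    REPO = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java'

    def untagged_digests(repo):
        # List manifests that carry no tags; these are the cleanup candidates.
        out = subprocess.run(
            ['gcloud', 'container', 'images', 'list-tags', repo,
             '--filter=-tags:*', '--format=json'],
            check=True, stdout=subprocess.PIPE, universal_newlines=True).stdout
        return [entry['digest'] for entry in json.loads(out)]

    for digest in untagged_digests(REPO):
        result = subprocess.run(
            ['gcloud', 'container', 'images', 'delete',
             '%s@%s' % (REPO, digest), '--quiet'],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE,
            universal_newlines=True)
        # A digest that is already gone (for example, removed by a concurrent
        # job) is not worth failing the build over.
        if result.returncode != 0 and 'Not found' not in result.stderr:
            raise RuntimeError(result.stderr)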

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35m 59s
132 actionable tasks: 95 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/xtk6z5nlebr4e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_PostCommit_XVR_Dataflow #1348

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1348/display/redirect>



Build failed in Jenkins: beam_PostCommit_XVR_Dataflow #1347

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1347/display/redirect>

Changes:


------------------------------------------
[...truncated 728.17 KB...]

self = <apache_beam.transforms.validate_runner_xlang_test.ValidateRunnerXlangTest testMethod=test_prefix>
test_pipeline = None

    @pytest.mark.xlang_transforms
    def test_prefix(self, test_pipeline=None):
      CrossLanguageTestPipelines().run_prefix(
>         test_pipeline or self.create_pipeline())

apache_beam/transforms/validate_runner_xlang_test.py:254: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/transforms/validate_runner_xlang_test.py:99: in run_prefix
    assert_that(res, equal_to(['0a', '0b']))
apache_beam/pipeline.py:596: in __exit__
    self.result = self.run()
apache_beam/testing/test_pipeline.py:114: in run
    False if self.not_use_test_runner_api else test_runner_api))
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:64: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20211022122056190981-3474'
 createTime: '2021-10-22T12:21:01.981955Z'
...021-10-22T12:21:01.981955Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f3879d15668>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
    
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach to a terminal state after waiting indefinitely.')
    
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
>             self)
E         apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E         Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..

apache_beam/runners/dataflow/dataflow_runner.py:1643: DataflowRuntimeException
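
A minimal usage sketch of the wait_until_finish API quoted in the traceback above, using the DirectRunner so the example is self-contained; for the Dataflow runner the optional duration argument is in milliseconds and, as the assertion in the quoted code shows, omitting it waits indefinitely.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    p = beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner']))
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)

    result = p.run()
    # Blocks until the pipeline reaches a terminal state; a failed Dataflow run
    # surfaces as a DataflowRuntimeException, as seen above.
    state = result.wait_until_finish()
    assert state == PipelineState.DONE
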
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:303 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.35.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211015
INFO     root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211015" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7f387c039598> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7f387c039d08> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:454 Defaulting to the temp_location as staging_location: gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/beam-sdks-java-testing-expansion-service-testExpansionService-2.35.0-SNAPSHOT-chW-Opb9iI4d_uet_1t9qGaU5878hJpS-348YJlZVEo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/beam-sdks-java-testing-expansion-service-testExpansionService-2.35.0-SNAPSHOT-chW-Opb9iI4d_uet_1t9qGaU5878hJpS-348YJlZVEo.jar in 4 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:638 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:657 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1022122056-189594.1634905256.190092/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:818 Create job: <Job
                                                                           clientRequestId: '20211022122056190981-3474'
                                                                           createTime: '2021-10-22T12:21:01.981955Z'
                                                                           currentStateTime: '1970-01-01T00:00:00Z'
                                                                           id: '2021-10-22_05_21_01-3974354247118139594'
                                                                           location: 'us-central1'
                                                                           name: 'beamapp-jenkins-1022122056-189594'
                                                                           projectId: 'apache-beam-testing'
                                                                           stageStates: []
                                                                           startTime: '2021-10-22T12:21:01.981955Z'
                                                                           steps: []
                                                                           tempFiles: []
                                                                           type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:820 Created job with id: [2021-10-22_05_21_01-3974354247118139594]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:821 Submitted job: 2021-10-22_05_21_01-3974354247118139594
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:827 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-22_05_21_01-3974354247118139594?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-10-22_05_21_01-3974354247118139594 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:04.684Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-10-22_05_21_01-3974354247118139594. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:04.730Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-10-22_05_21_01-3974354247118139594.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:06.581Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.387Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.421Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.506Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.534Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.557Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.584Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.611Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.656Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.689Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.713Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.733Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_27 for input ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_25.None-post14
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.759Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten, into producer assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.779Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.809Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/RestoreTags into assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.845Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.887Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.921Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:3222>) into Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:07.986Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:3222>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.041Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.109Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.134Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.162Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.198Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.258Z: JOB_MESSAGE_DETAILED: Fusing consumer ExternalTransform(beam:transforms:xlang:test:prefix)/Map/ParMultiDo(Anonymous) into Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.290Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into ExternalTransform(beam:transforms:xlang:test:prefix)/Map/ParMultiDo(Anonymous)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.321Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:3222>) into assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.353Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:3222>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.388Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.414Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.447Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.501Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.524Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.550Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.585Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.621Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.660Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.779Z: JOB_MESSAGE_DEBUG: Executing wait step start26
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.831Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.856Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.869Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.903Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.954Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:08.967Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:09.003Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:09.026Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:09.060Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3222>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:09.093Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3222>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:33.655Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:21:43.789Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:22:34.338Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:23:01.913Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:23:44.459Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:24:11.958Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:24:53.958Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:25:20.142Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:26:02.952Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:26:29.661Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.695Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.727Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113 not found: manifest unknown: Failed to fetch "20211022120113" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211022120113".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.794Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3222>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.794Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3222>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.866Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.938Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:11.960Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:12.342Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:59.303Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-10-22T12:27:59.333Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-10-22_05_21_01-3974354247118139594 is in state JOB_STATE_FAILED
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
=============== 9 failed, 1 passed, 4 warnings in 972.24 seconds ===============

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED
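
The underlying failure is that the tag 20211022120113 referenced by --sdk_harness_container_image_overrides was not present in GCR when the workers tried to pull it. A hedged pre-flight check along the following lines (hypothetical, not part of the suite; image name copied from the log, gcloud CLI assumed on PATH) would fail fast instead of letting workers retry for several minutes:

    import subprocess

    IMAGE = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113'

    probe = subprocess.run(
        ['gcloud', 'container', 'images', 'describe', IMAGE],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
    if probe.returncode != 0:
        # Abort before submitting the job: the harness override points at an
        # image the Dataflow workers will not be able to pull.
        raise SystemExit('Harness image %s is not resolvable:\n%s'
                         % (IMAGE, probe.stderr))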

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 10933.
Stopping expansion service pid: 10936.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5976ba21d43c835ecaa869e505071ba1a5ebfc844e6a8a6e897e7f4f1f6cf717
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211022120113]

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 279

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 28m 9s
132 actionable tasks: 94 executed, 34 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/rpummak6puobc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org