Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/01/12 06:32:54 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Dataflow #1674

See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1674/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13616] Initial files for vendored gRPC 1.43.2 (#16460)


------------------------------------------
[...truncated 730.05 KB...]
          raise IOError('Failed to get the Dataflow job id.')
    
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach to a terminal state after waiting indefinitely.')
    
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
>             self)
E         apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E         Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..

apache_beam/runners/dataflow/dataflow_runner.py:1639: DataflowRuntimeException
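The exception above boils down to a worker harness image tag (us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103) that was never pushed to the registry, so every worker start-up attempt fails on the image pull. A minimal pre-flight sketch follows, assuming the gcloud CLI is available on the submitting host; the helper name and the check itself are illustrative and not part of the Beam test harness:

    # Hypothetical pre-flight check (not part of the Beam test suite): confirm that
    # the SDK harness image tag resolves in the registry before submitting the job.
    import subprocess

    def image_tag_exists(image_ref):
        """Return True if `gcloud container images describe` can resolve the tag."""
        result = subprocess.run(
            ["gcloud", "container", "images", "describe", image_ref],
            capture_output=True, text=True)
        return result.returncode == 0

    image = "us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103"
    if not image_tag_exists(image):
        raise RuntimeError(
            "SDK container image %s is not in the registry; Dataflow workers "
            "will fail to start." % image)
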
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:303 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.37.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211222
INFO     root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211222" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7f223357f1e0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7f223357f950> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:456 Defaulting to the temp_location as staging_location: gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/beam-sdks-java-testing-expansion-service-testExpansionService-2.37.0-SNAPSHOT-omaQVwNVWA6_rt0sc6DjGQpPAfmYvIvrBf17FcJipHE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/beam-sdks-java-testing-expansion-service-testExpansionService-2.37.0-SNAPSHOT-omaQVwNVWA6_rt0sc6DjGQpPAfmYvIvrBf17FcJipHE.jar in 6 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:718 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0112062444-840934.1641968684.841438/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:879 Create job: <Job
                                                                           clientRequestId: '20220112062444842341-3474'
                                                                           createTime: '2022-01-12T06:24:53.257225Z'
                                                                           currentStateTime: '1970-01-01T00:00:00Z'
                                                                           id: '2022-01-11_22_24_52-11359699220188505620'
                                                                           location: 'us-central1'
                                                                           name: 'beamapp-jenkins-0112062444-840934'
                                                                           projectId: 'apache-beam-testing'
                                                                           stageStates: []
                                                                           startTime: '2022-01-12T06:24:53.257225Z'
                                                                           steps: []
                                                                           tempFiles: []
                                                                           type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:881 Created job with id: [2022-01-11_22_24_52-11359699220188505620]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:882 Submitted job: 2022-01-11_22_24_52-11359699220188505620
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:888 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-11_22_24_52-11359699220188505620?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-01-11_22_24_52-11359699220188505620 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:24:55.873Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-01-11_22_24_52-11359699220188505620. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:24:58.181Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-01-11_22_24_52-11359699220188505620.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:00.915Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.724Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.753Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.809Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.840Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step check_odd/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.859Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step check_even/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.883Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.912Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.937Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:01.980Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.005Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.029Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.059Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Flatten_45
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.087Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Flatten_45
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.116Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Flatten_27 for input ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Tag-0-_25.None-post19
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.144Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of check_even/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten check_even/Group/CoGroupByKeyImpl/Flatten, into producer check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.165Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/CoGroupByKeyImpl/MapTuple(collect_values) into check_even/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.188Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/RestoreTags into check_even/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.214Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Unkey into check_even/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.237Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Match into check_even/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.260Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Flatten_45 for input ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Tag-0-_43.None-post25
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.287Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten check_odd/Group/CoGroupByKeyImpl/Flatten, into producer check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.314Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/CoGroupByKeyImpl/MapTuple(collect_values) into check_odd/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.337Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/RestoreTags into check_odd/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.385Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Unkey into check_odd/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.439Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Match into check_odd/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.464Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/CoGroupByKeyImpl/GroupByKey/Write into check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.490Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write into check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.515Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:3228>) into Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.569Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.613Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.633Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.682Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.705Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.760Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.783Z: JOB_MESSAGE_DETAILED: Fusing consumer ExternalTransform(beam:transforms:xlang:test:partition)/ParMultiDo(Partition) into Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.808Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/WindowInto(WindowIntoFn) into ExternalTransform(beam:transforms:xlang:test:partition)/ParMultiDo(Partition)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.830Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/WindowInto(WindowIntoFn) into ExternalTransform(beam:transforms:xlang:test:partition)/ParMultiDo(Partition)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.853Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Create/FlatMap(<lambda at core.py:3228>) into check_even/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.875Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Create/Map(decode) into check_even/Create/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.893Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/CoGroupByKeyImpl/Tag[0] into check_even/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.916Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity into check_even/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.939Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/ToVoidKey into check_even/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.963Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/CoGroupByKeyImpl/Tag[1] into check_even/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:02.990Z: JOB_MESSAGE_DETAILED: Fusing consumer check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity into check_even/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.015Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Create/FlatMap(<lambda at core.py:3228>) into check_odd/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.037Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Create/Map(decode) into check_odd/Create/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.056Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/CoGroupByKeyImpl/Tag[0] into check_odd/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.078Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity into check_odd/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.102Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/ToVoidKey into check_odd/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.124Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/CoGroupByKeyImpl/Tag[1] into check_odd/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.145Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity into check_odd/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.173Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.190Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.211Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.233Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.356Z: JOB_MESSAGE_DEBUG: Executing wait step start39
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.415Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.446Z: JOB_MESSAGE_BASIC: Executing operation check_even/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.451Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.475Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.497Z: JOB_MESSAGE_BASIC: Executing operation check_odd/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.518Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.536Z: JOB_MESSAGE_BASIC: Finished operation check_even/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.551Z: JOB_MESSAGE_BASIC: Finished operation check_odd/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.579Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.613Z: JOB_MESSAGE_DEBUG: Value "check_even/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.647Z: JOB_MESSAGE_DEBUG: Value "check_odd/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.680Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3228>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.703Z: JOB_MESSAGE_BASIC: Executing operation check_even/Create/Impulse+check_even/Create/FlatMap(<lambda at core.py:3228>)+check_even/Create/Map(decode)+check_even/Group/CoGroupByKeyImpl/Tag[0]+check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_even/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:03.753Z: JOB_MESSAGE_BASIC: Executing operation check_odd/Create/Impulse+check_odd/Create/FlatMap(<lambda at core.py:3228>)+check_odd/Create/Map(decode)+check_odd/Group/CoGroupByKeyImpl/Tag[0]+check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:24.619Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:25:48.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:26:32.580Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:26:56.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:27:42.871Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:28:09.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:28:55.075Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:29:22.255Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:30:08.835Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:30:35.507Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.383Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.406Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103 not found: manifest unknown: Failed to fetch "20220112060103" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220112060103".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.471Z: JOB_MESSAGE_BASIC: Finished operation check_even/Create/Impulse+check_even/Create/FlatMap(<lambda at core.py:3228>)+check_even/Create/Map(decode)+check_even/Group/CoGroupByKeyImpl/Tag[0]+check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_even/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.471Z: JOB_MESSAGE_BASIC: Finished operation check_odd/Create/Impulse+check_odd/Create/FlatMap(<lambda at core.py:3228>)+check_odd/Create/Map(decode)+check_odd/Group/CoGroupByKeyImpl/Tag[0]+check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.471Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3228>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.533Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.612Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.639Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:31:22.946Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:32:09.211Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-12T06:32:09.237Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-01-11_22_24_52-11359699220188505620 is in state JOB_STATE_FAILED
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
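For reference, the eight DeprecationWarning entries above are caused by backslash sequences such as \c and \d inside ordinary (non-raw) string literals in filesystems_test.py. A minimal sketch of the raw-string form that would produce the same paths without the warning; this is an illustration only, not a change made in this build:

    # Hypothetical fix illustration: raw strings avoid the invalid-escape warning
    # while producing byte-for-byte the same path as 'c:\\abc\cdf'.
    from apache_beam.io.filesystems import FileSystems

    assert FileSystems.get_scheme(r'c:\abc\cdf') is None        # no scheme in a Windows path
    assert FileSystems.get_filesystem(r'c:\abc\def') is not None  # resolves to the local filesystem
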
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
============== 9 failed, 1 passed, 8 warnings in 1013.90 seconds ===============

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 29571.
Stopping expansion service pid: 29573.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Error: No such image: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103]
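The cleanUpDockerImages failure is a secondary symptom of the same problem: the 20220112060103 tag was never pushed, so there is nothing to untag. A hedged sketch of a cleanup step that tolerates a missing image follows (illustrative only; the helper and its behavior are assumptions, not the actual logic at build.gradle line 287):

    # Hypothetical tolerant cleanup: only untag when the reference resolves, so a
    # failed image build does not add a second failure to the report.
    import subprocess

    def untag_if_present(image_ref):
        describe = subprocess.run(
            ["gcloud", "container", "images", "describe", image_ref],
            capture_output=True, text=True)
        if describe.returncode != 0:
            print("Image %s not found; skipping untag." % image_ref)
            return
        subprocess.run(
            ["gcloud", "container", "images", "untag", "--quiet", image_ref],
            check=True)

    untag_if_present(
        "us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220112060103")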

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 'https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle' line: 287

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 32m 20s
137 actionable tasks: 94 executed, 37 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/abqcugw2d472i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Dataflow #1675

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1675/display/redirect>

