Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/08/20 00:27:31 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Dataflow #1093

See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1093/display/redirect?page=changes>

Changes:

[noreply] Change conflicting StateReader name from side to reader (#15348)

[noreply] [BEAM-3304] Go triggering support (#15239)


------------------------------------------
[...truncated 298.59 KB...]
 createTime: '2021-08-20T00:17:15.751376Z'
 currentState: CurrentStateValueValuesEnum(JOB...021-08-20T00:17:15.751376Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f6a1bf74fd0>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
    
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach to a terminal state after waiting indefinitely.')
    
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
>             self)
E         apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E         Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..

apache_beam/runners/dataflow/dataflow_runner.py:1635: DataflowRuntimeException
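
The failure above is raised from DataflowPipelineResult.wait_until_finish, which polls the Dataflow job on a daemon thread and raises DataflowRuntimeException once the job reaches a terminal state other than DONE. A minimal sketch of the caller side, assuming a trivial placeholder pipeline and placeholder GCP options (project, bucket), not taken from this build:

    # Sketch only: submit a pipeline to Dataflow and surface the error that
    # wait_until_finish() raises when the job ends in a failed state.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                 # placeholder
        region='us-central1',
        temp_location='gs://my-bucket/tmp')   # placeholder

    p = beam.Pipeline(options=options)
    _ = p | beam.Create(['a', 'b']) | beam.Map(lambda s: s.upper())

    result = p.run()
    try:
        # Blocks while a daemon thread polls the service, as in the code above.
        result.wait_until_finish()
    except DataflowRuntimeException as exc:
        # The message embeds the terminal state and the runner's last_error_msg,
        # here the "manifest ... not found" image-pull error.
        print('Dataflow job failed:', exc)
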
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:300 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.33.0.dev
INFO     root:environments.py:296 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809
INFO     root:environments.py:304 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20210809" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:650 ==================== <function pack_combiners at 0x7f69874d0598> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:650 ==================== <function sort_stages at 0x7f69874d0d08> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:452 Defaulting to the temp_location as staging_location: gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/beam-sdks-java-testing-expansion-service-testExpansionService-2.33.0-SNAPSHOT-DdeK58OVcotVRFDx075qsAf4qwKBzla1AuxWELTh-Sw.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/beam-sdks-java-testing-expansion-service-testExpansionService-2.33.0-SNAPSHOT-DdeK58OVcotVRFDx075qsAf4qwKBzla1AuxWELTh-Sw.jar in 5 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:631 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:650 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0820001707-925614.1629418627.926167/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:797 Create job: <Job
                                                                           createTime: '2021-08-20T00:17:15.751376Z'
                                                                           currentStateTime: '1970-01-01T00:00:00Z'
                                                                           id: '2021-08-19_17_17_14-17606721475194521700'
                                                                           location: 'us-central1'
                                                                           name: 'beamapp-jenkins-0820001707-925614'
                                                                           projectId: 'apache-beam-testing'
                                                                           stageStates: []
                                                                           startTime: '2021-08-20T00:17:15.751376Z'
                                                                           steps: []
                                                                           tempFiles: []
                                                                           type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:799 Created job with id: [2021-08-19_17_17_14-17606721475194521700]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:800 Submitted job: 2021-08-19_17_17_14-17606721475194521700
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:806 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-19_17_17_14-17606721475194521700?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-08-19_17_17_14-17606721475194521700 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:18.506Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-08-19_17_17_14-17606721475194521700. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:18.607Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-08-19_17_17_14-17606721475194521700.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:23.869Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.748Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.781Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.852Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.894Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/_CoGBKImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.918Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.949Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:24.977Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.012Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.039Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_assert_that-Group-_CoGBKImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.083Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_assert_that-Group-_CoGBKImpl-Flatten_27
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.122Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_assert_that-Group-_CoGBKImpl-Flatten_27 for input ref_AppliedPTransform_assert_that-Group-_CoGBKImpl-Tag-0-_25.None-post14
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.179Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/_CoGBKImpl/GroupByKey/Write, through flatten assert_that/Group/_CoGBKImpl/Flatten, into producer assert_that/Group/_CoGBKImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.216Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/_CoGBKImpl/MapTuple(collect_values) into assert_that/Group/_CoGBKImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.267Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/RestoreTags into assert_that/Group/_CoGBKImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.300Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.341Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.373Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/_CoGBKImpl/GroupByKey/Write into assert_that/Group/_CoGBKImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.395Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2968>) into Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.438Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2968>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.478Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.504Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.544Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.582Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.615Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.684Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.710Z: JOB_MESSAGE_DETAILED: Fusing consumer ExternalTransform(beam:transforms:xlang:test:prefix)/Map/ParMultiDo(Anonymous) into Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.735Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into ExternalTransform(beam:transforms:xlang:test:prefix)/Map/ParMultiDo(Anonymous)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.767Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2968>) into assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.803Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2968>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.827Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/_CoGBKImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.866Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/_CoGBKImpl/Flatten/InputIdentity into assert_that/Group/_CoGBKImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.898Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.924Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/_CoGBKImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.947Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/_CoGBKImpl/Flatten/InputIdentity into assert_that/Group/_CoGBKImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:25.982Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.012Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.048Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.108Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.387Z: JOB_MESSAGE_DEBUG: Executing wait step start26
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.532Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.571Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/_CoGBKImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.584Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.613Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.644Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.658Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/_CoGBKImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.709Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.753Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/_CoGBKImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.793Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:2968>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:26.823Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2968>)+assert_that/Create/Map(decode)+assert_that/Group/_CoGBKImpl/Tag[0]+assert_that/Group/_CoGBKImpl/Flatten/InputIdentity+assert_that/Group/_CoGBKImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:17:50.070Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:18:13.969Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:19:01.497Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:19:28.670Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:20:13.831Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:20:36.799Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:21:35.580Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:22:01.635Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:24:50.264Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:25:15.502Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.153Z: JOB_MESSAGE_WARNING: A worker was unable to start up.  Error: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.188Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job appears to be stuck. Several workers have failed to start up in a row, and no worker has successfully started up for this job. Last error reported: Unable to pull container image due to error: image pull request failed with error: Error response from daemon: manifest for us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108 not found: manifest unknown: Failed to fetch "20210820000108" from request "/v2/apache-beam-testing/java-postcommit-it/java/manifests/20210820000108".. This is likely due to an invalid SDK container image URL. Please verify any provided SDK container image is valid and that Dataflow workers have permissions to pull image..
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.237Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2968>)+assert_that/Create/Map(decode)+assert_that/Group/_CoGBKImpl/Tag[0]+assert_that/Group/_CoGBKImpl/Flatten/InputIdentity+assert_that/Group/_CoGBKImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.280Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:2968>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.352Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.429Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.462Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:00.749Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:47.346Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2021-08-20T00:26:47.373Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2021-08-19_17_17_14-17606721475194521700 is in state JOB_STATE_FAILED
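
The job never started a worker because the tag 20210820000108 of us.gcr.io/apache-beam-testing/java-postcommit-it/java was not present in the registry when Dataflow tried to pull it. As a rough illustration of the "verify any provided SDK container image is valid" advice in the error message, a pre-flight check could ask the registry for the tag before submitting the job; the snippet below is a sketch using the gcloud CLI and is not part of the build scripts.

    # Sketch only: fail fast if the worker container image tag does not resolve.
    import subprocess

    IMAGE = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108'

    def image_exists(image: str) -> bool:
        # 'gcloud container images describe' exits non-zero when the manifest
        # for the given tag cannot be found, the condition the workers hit here.
        probe = subprocess.run(
            ['gcloud', 'container', 'images', 'describe', image],
            capture_output=True, text=True)
        return probe.returncode == 0

    if not image_exists(IMAGE):
        raise SystemExit('SDK container image not found: ' + IMAGE)
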
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
=============== 7 failed, 1 passed, 2 warnings in 641.82 seconds ===============

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 18498.
Stopping expansion service pid: 18503.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:902b66156dedf24990bcda7092c7121ab4282feb0df29e12c626e378369ed39e
Deleted: sha256:3aedd4507c090ff409dc67e2f746f8e13c1b2d9b8fda858ad8e92dd88e6abf44
Deleted: sha256:4b4cd75ff5ae73ed6d31b03bc8978675b777d33906825da8deb8f6643837ffdd
Deleted: sha256:113592febc43f3a2deaef2ead88814b3caf15401c6fb0ca078a2327133fd753b
Deleted: sha256:ce1768c17731a0652bd7451e4b95bfec75a50b892c783f0ba0a2677083962fb2
Deleted: sha256:f655b8dc0230120f3d7f16beb38a9865389edbb495f23d85e9fd670c69ec6619
Deleted: sha256:bef525c68d3f56cd36cf468c0bf6e624ff57f72bb0f7aef407b73b93c7894afe
Deleted: sha256:675bfefe87b5745eaa1df534e1515f69e873854e66fe5e38574ce3dd560ec529
Deleted: sha256:89bcbad7f66efc568f00eba4f7b8bd601a2551b4b9fa57a02b9262cc4791b608
Deleted: sha256:09cdb14e61b7284d1b6a2b47d3bd6d56435d871ad705972f24256783e6020a3c
Deleted: sha256:8177b63171b7e83e3e35890e47f45d929386216fa22f99c264eaa8ffd5ce369e
Deleted: sha256:0522961c1ac4c383c76b65aea145d03df1031ed9a4afa4590f819ff786672d85
Deleted: sha256:fc8bc34f6032877910cb9d65dee712c42d9dc4dc73d7d687955c23c2b418d11c
Deleted: sha256:ebb79327504dba01053b9296c57c2e74694f25847fcff0660c60bfbf94a7cac9
Deleted: sha256:8bad1a9ddbdb11c615cd7c51a8e72f82d59f86f14a8bc997e33544b77ca86136
Deleted: sha256:20fcd047fb589a9ad12da2be9851c2a93fd1fedf1b0403f2fb61b077e4cb608a
Deleted: sha256:2106e62bedea3fa2e6bed497435d57ab23b9e8ae56459dec9b68cd2ad15343c8
Deleted: sha256:5d0f6a77a0d72731ba9e83b615e4c007c5050e2339eab59ccac6308b48a0dd1c
Deleted: sha256:1dcd5b353cd30608334ff2de7798b28b573bba30c39a77e7bc98eda76096286d
Deleted: sha256:e788b8a132c7fcdd38ef22058629f0e8ffb96f95697fc18475da137135aeb997
Deleted: sha256:03d3047261f2f37a1fcd258acb345fe2d45c094fc80b1609c112f703af81e767
Deleted: sha256:860cb5a5d7d79909cde013c14a93f3955f3cab3f2309cae9969dc07e5cc4195b
Deleted: sha256:14bc2e9cf2c903174a71c1baf44f41d365eb56e999ce318a4ec7ae788e78ee75
Deleted: sha256:3bc2accdda1d96aa16f6bca4c4c246ab9c2c88688990a968d1685dc6619a444e
Deleted: sha256:68034892245d6e46dbc1f01098982dcda712d0fdce0648e33a752fae0c68fc87
Deleted: sha256:c05340aa65b2b9e70e381cfd383277f74b756d7ff900aaff72e2687f90632f6f
Deleted: sha256:60ec23167d938833d67f3f2290d1a6272606dab4a3ce2c1e4e5ddec351811550
Deleted: sha256:7ea118d783202cc81ce382d51004382cbc74b5ff5fc7e20ea0f79c9973051562
Deleted: sha256:63c7c19254962c48a9e4dcce30f8e1d69852ebbc7e63ed95693bc418fcb30a44
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108]

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 279

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
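
The second failure is confined to cleanup: the tag had already been removed (see the "Untagged:" lines above), so the gcloud untag call in cleanUpDockerImages exited non-zero. A more tolerant cleanup could probe for the tag first and skip the untag when it is already gone; the sketch below is illustrative and is not the actual build.gradle task.

    # Sketch only: untag a GCR image only if the tag still resolves, so a
    # repeated or racing cleanup does not fail the build.
    import subprocess

    IMAGE = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210820000108'

    def untag_if_present(image: str) -> None:
        probe = subprocess.run(
            ['gcloud', 'container', 'images', 'describe', image],
            capture_output=True, text=True)
        if probe.returncode != 0:
            print('Tag already gone, skipping untag:', image)
            return
        # --quiet suppresses the interactive confirmation prompt.
        subprocess.run(
            ['gcloud', 'container', 'images', 'untag', '--quiet', image],
            check=True)

    untag_if_present(IMAGE)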

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 57s
105 actionable tasks: 76 executed, 25 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/lpo6o6m2os6ju

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Dataflow #1094

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1094/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org