Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/24 02:28:20 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #5365

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5365/display/redirect>

Changes:


------------------------------------------
[...truncated 289.64 KB...]
                    {
                      "@type": "kind:bytes"
                    },
                    {
                      "@type": "kind:varint"
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": "ref_AppliedPTransform_Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1_37",
        "user_name": "Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
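
For context on the step names in the truncated job description above: the Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0 and .../MapToVoidKey1 steps are what the Dataflow runner generates for a transform that takes the same "side list" as two side inputs. Below is a minimal, hedged sketch of a pipeline with that shape; it is not the actual sideinputs_test.py source, the element values are illustrative, and as written it runs on the DirectRunner.

# Hedged sketch only: not the real apache_beam/transforms/sideinputs_test.py code.
# It builds one main input and one "side list" used twice as a side input,
# which is the pipeline shape described by the job graph above.
import apache_beam as beam
from apache_beam.pvalue import AsList
from apache_beam.testing.util import assert_that, equal_to

with beam.Pipeline() as p:
    main = p | 'main input' >> beam.Create([1, 2, 3])
    side = p | 'side list' >> beam.Create([10, 20])
    result = main | 'Map' >> beam.Map(
        lambda x, s1, s2: x + sum(s1) + sum(s2),
        AsList(side),  # first side input: corresponds to the MapToVoidKey0 steps above
        AsList(side))  # second side input: corresponds to the MapToVoidKey1 steps above
    assert_that(result, equal_to([61, 62, 63]))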
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2019-12-24T02:04:31.067306Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-23_18_04_29-14920219712873808177'
 location: 'us-central1'
 name: 'beamapp-jenkins-1224020417-620401'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-24T02:04:31.067306Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2019-12-23_18_04_29-14920219712873808177]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_29-14920219712873808177?project=apache-beam-testing
apache_beam.runners.dataflow.test_dataflow_runner: WARNING: Waiting indefinitely for streaming job.
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-23_18_04_29-14920219712873808177 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:37.286Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:38.215Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.797Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.800Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.810Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.817Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.819Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.832Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.835Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.849Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.871Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.875Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0 into side list/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.877Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1 into side list/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.880Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0 into side list/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.882Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1 into side list/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.884Z: JOB_MESSAGE_DETAILED: Unzipping flatten s24 for input s22.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.888Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.890Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.892Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/FlatMap(<lambda at core.py:2570>) into side list/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.895Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/AddRandomKeys into side list/FlatMap(<lambda at core.py:2570>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.897Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into side list/MaybeReshuffle/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.899Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.902Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.904Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.907Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.909Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/Map(decode) into side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.912Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey into Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.915Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.918Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.921Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.923Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.926Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey into Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.929Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/WriteStream into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.934Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.936Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/StreamingPCollectionViewWriter into Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.939Z: JOB_MESSAGE_DETAILED: Fusing consumer main input/FlatMap(<lambda at core.py:2570>) into main input/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.942Z: JOB_MESSAGE_DETAILED: Fusing consumer main input/Map(decode) into main input/FlatMap(<lambda at core.py:2570>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.944Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:268>)/Map(<lambda at sideinputs_test.py:268>) into main input/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.946Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:268>)/Map(<lambda at sideinputs_test.py:268>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.949Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2570>) into assert_that/Create/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.951Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2570>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.953Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.956Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.958Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.961Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.964Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.966Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.969Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:39.985Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:40.030Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:40.081Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:40.169Z: JOB_MESSAGE_DEBUG: Executing wait step start2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:40.197Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:40.204Z: JOB_MESSAGE_BASIC: Starting 1 workers...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.011Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/ReadStream+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/StreamingPCollectionViewWriter
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.011Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2570>)+assert_that/Create/Map(decode)+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.011Z: JOB_MESSAGE_BASIC: Executing operation side list/Impulse+side list/FlatMap(<lambda at core.py:2570>)+side list/MaybeReshuffle/Reshuffle/AddRandomKeys+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.011Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/ReadStream+assert_that/Group/GroupByKey/MergeBuckets+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.016Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.017Z: JOB_MESSAGE_BASIC: Executing operation side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys+side list/Map(decode)+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:04:44.040Z: JOB_MESSAGE_BASIC: Executing operation main input/Impulse+main input/FlatMap(<lambda at core.py:2570>)+main input/Map(decode)+Map(<lambda at sideinputs_test.py:268>)/Map(<lambda at sideinputs_test.py:268>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:05.871Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:15.970Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:15.971Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:16.771Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.300Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:2570>)+assert_that/Create/Map(decode)+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/ReadStream+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/MergeBuckets+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/Values+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/StreamingPCollectionViewWriter
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation side list/Impulse+side list/FlatMap(<lambda at core.py:2570>)+side list/MaybeReshuffle/Reshuffle/AddRandomKeys+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys+side list/Map(decode)+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:268>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/PairWithVoidKey+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey0.out.0)/GroupByKey/WriteStream+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/PairWithVoidKey+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/ReadStream+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKey/MergeBuckets+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/Values+Map(<lambda at sideinputs_test.py:268>)/_UnpickledSideInput(MapToVoidKey1.out.0)/StreamingPCollectionViewWriter
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/ReadStream+assert_that/Group/GroupByKey/MergeBuckets+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:17.332Z: JOB_MESSAGE_BASIC: Finished operation main input/Impulse+main input/FlatMap(<lambda at core.py:2570>)+main input/Map(decode)+Map(<lambda at sideinputs_test.py:268>)/Map(<lambda at sideinputs_test.py:268>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:26.237Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:26.263Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:26.269Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:26.272Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:26.280Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:05:35.508Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-24T02:06:57.656Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-23_18_04_29-14920219712873808177 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
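
The JOB_MESSAGE_WARNING at 02:05:05 above reports that apache-beam-testing already holds 100 Dataflow-created metric descriptors and points to the Cloud Monitoring metricDescriptors.list/delete APIs for cleanup. A hedged sketch of that cleanup with the Python Monitoring client follows; it assumes google-cloud-monitoring 2.x, and the metric-type filter prefix is an assumption that should be verified against the project before deleting anything.

# Hedged cleanup sketch for the metric-descriptor warning above.
# Assumes google-cloud-monitoring 2.x; the metric.type prefix below is an
# assumption, so verify what the project actually contains before deleting.
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/apache-beam-testing"  # project id taken from the log above

request = {
    "name": project_name,
    "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
}
for descriptor in client.list_metric_descriptors(request=request):
    print("deleting", descriptor.type)
    client.delete_metric_descriptor(name=descriptor.name)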
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_51-8105849980556875967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_03_59-16613251839095578715?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_11_19-104381588289452534?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_49-5015789993124393362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_10-2714685194056482647?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_11_56-12811072028004782700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_19_42-16602548502907291632?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_51-1027863897894372278?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_05_06-9738035640840424927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_51-14152433961969151100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_16-15613249888068821392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_11_38-264412759093851291?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_49-7829984151812095286?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_29-14920219712873808177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_07_24-2708274334390184738?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_49-11286717782175408509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_05_05-14335737847798755200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_48-7144709328292293203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_10-6505282591832938768?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_11_54-17696245203065018560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_50-17078738012978272267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_25-11415209031015038345?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 26 tests in 1890.029s

FAILED (errors=1)

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerStreamingTests
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_47-11579229968818890760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_05-308367233801658383?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_11_34-16903035743447892654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_46-3487538351229785808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_50-50039728342109704?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_49-8586474101821554915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_05_28-6641320218340883617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_12_17-16920186792320760701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_47-3429812413384771177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_55-2178524826991860929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_12_19-9420069086384552188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_48-17299646879347383250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_28-6420040056376986828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_47-17695435571086885147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_28-9987492651800491473?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_11_47-8436541693808302044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_46-16335546373932733794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_05_15-11164521119680273338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_17_56_46-18200707023812423027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_04_45-5066603988745850321?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_12_24-14163689607736415886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-23_18_20_37-7956180240185397692?project=apache-beam-testing
test_gbk_many_values (apache_beam.runners.portability.fn_api_runner_test.FnApiBasedStateBackedCoderTest) ... ok
Test TimestampCombiner with EARLIEST. ... ok
Test TimestampCombiner with LATEST. ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_a_flattened_pcollection (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_pcollections (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_one_single_pcollection (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
Test a GBK sideinput, with multiple triggering. ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 26 tests in 1928.311s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 101

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
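
To reproduce the failing suite with the suggested flags, the task path from the error above can be rerun directly, e.g. (assuming Beam's standard ./gradlew wrapper at the repository root):

./gradlew :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests --stacktrace --info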

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 30s
74 actionable tasks: 57 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/wz6hvfvwgdekc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #5366

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5366/display/redirect>

