Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/12/31 21:54:57 UTC

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow #418

See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/418/display/redirect>

------------------------------------------
[...truncated 8.10 MB...]
    Dec 31, 2018 8:52:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$13/RunChecks as step s35
    Dec 31, 2018 8:52:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$13/VerifyAssertions/ParDo(DefaultConclude) as step s36
    Dec 31, 2018 8:52:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-end-to-end-tests/flattentest0testflattenpcollectionsemptythenpardo-jenkins-1231205201-76367de2/output/results/staging/
    Dec 31, 2018 8:52:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <77877 bytes, hash snptCgDQuHaQF7PWLMOYVw> to gs://temp-storage-for-end-to-end-tests/flattentest0testflattenpcollectionsemptythenpardo-jenkins-1231205201-76367de2/output/results/staging/pipeline-snptCgDQuHaQF7PWLMOYVw.pb

org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_OUT
    Dataflow SDK version: 2.10.0-SNAPSHOT

org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_ERROR
    Dec 31, 2018 8:52:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-31_12_52_10-2719036602827136060?project=apache-beam-testing

org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_OUT
    Submitted job: 2018-12-31_12_52_10-2719036602827136060

org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_ERROR
    Dec 31, 2018 8:52:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-12-31_12_52_10-2719036602827136060
    Dec 31, 2018 8:52:11 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2018-12-31_12_52_10-2719036602827136060 with 0 expected assertions.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:10.195Z: Autoscaling is enabled for job 2018-12-31_12_52_10-2719036602827136060. The number of workers will be between 1 and 1000.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:10.226Z: Autoscaling was automatically enabled for job 2018-12-31_12_52_10-2719036602827136060.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2018-12-31T20:52:12.921Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: flattentest0testflattenpcollectionsemptythenpardo-jenkins--ji6i. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:13.246Z: Checking permissions granted to controller Service Account.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:17.983Z: Worker configuration: n1-standard-1 in us-central1-b.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:18.528Z: Expanding CollectionToSingleton operations into optimizable parts.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:18.619Z: Expanding CoGroupByKey operations into optimizable parts.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:18.883Z: Expanding GroupByKey operations into optimizable parts.
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:18.986Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.032Z: Elided trivial flatten 
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.070Z: Elided trivial flatten 
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.116Z: Elided trivial flatten 
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.152Z: Unzipping flatten s29 for input s17.org.apache.beam.sdk.values.PCollection.<init>:402#a4f9f304fed667ee
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.188Z: Fusing unzipped copy of PAssert$13/GroupGlobally/GroupDummyAndContents/Reify, through flatten PAssert$13/GroupGlobally/FlattenDummyAndContents, into producer PAssert$13/GroupGlobally/KeyForDummy/AddKeys/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.235Z: Fusing consumer PAssert$13/VerifyAssertions/ParDo(DefaultConclude) into PAssert$13/RunChecks
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.279Z: Unzipping flatten s29-u40 for input s31-reify-value18-c38
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.325Z: Fusing unzipped copy of PAssert$13/GroupGlobally/GroupDummyAndContents/Write, through flatten s29-u40, into producer PAssert$13/GroupGlobally/GroupDummyAndContents/Reify
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.371Z: Fusing consumer PAssert$13/GroupGlobally/GroupDummyAndContents/GroupByWindow into PAssert$13/GroupGlobally/GroupDummyAndContents/Read
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.418Z: Fusing consumer PAssert$13/GroupGlobally/Values/Values/Map into PAssert$13/GroupGlobally/GroupDummyAndContents/GroupByWindow
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.456Z: Fusing consumer PAssert$13/GetPane/Map into PAssert$13/GroupGlobally/ParDo(Concat)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.500Z: Fusing consumer PAssert$13/RunChecks into PAssert$13/GetPane/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.552Z: Fusing consumer PAssert$13/GroupGlobally/ParDo(Concat) into PAssert$13/GroupGlobally/Values/Values/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.598Z: Fusing consumer PAssert$13/GroupGlobally/GroupDummyAndContents/Reify into PAssert$13/GroupGlobally/WindowIntoDummy/Window.Assign
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.630Z: Fusing consumer PAssert$13/GroupGlobally/GroupDummyAndContents/Write into PAssert$13/GroupGlobally/GroupDummyAndContents/Reify
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.666Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.711Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.748Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Reify into PAssert$13/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.792Z: Fusing consumer PAssert$13/GroupGlobally/WindowIntoDummy/Window.Assign into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.839Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.888Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow into PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Read
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.920Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Write into PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Reify
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:19.967Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource) into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.013Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.055Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.096Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.134Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.180Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.227Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.275Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/Values/Values/Map into PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.321Z: Fusing consumer PAssert$13/GroupGlobally/Window.Into()/Window.Assign into ParDo(Identity)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.358Z: Fusing consumer PAssert$13/GroupGlobally/RewindowActuals/Window.Assign into PAssert$13/GroupGlobally/GatherAllOutputs/Values/Values/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.394Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.434Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource) into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.476Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key into Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.534Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign into PAssert$13/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.572Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.613Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map into PAssert$13/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.662Z: Fusing consumer PAssert$13/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) into PAssert$13/GroupGlobally/Window.Into()/Window.Assign
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.700Z: Fusing consumer PAssert$13/GroupGlobally/KeyForDummy/AddKeys/Map into PAssert$13/GroupGlobally/RewindowActuals/Window.Assign
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.744Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.784Z: Fusing consumer Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource) into Flatten.PCollections/Create.Values/Read(CreateSource)/Impulse
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.828Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource) into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Impulse
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.873Z: Fusing consumer PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:20.918Z: Fusing consumer ParDo(Identity) into Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.369Z: Executing operation PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Create
    Dec 31, 2018 8:52:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.416Z: Executing operation PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Dec 31, 2018 8:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.453Z: Executing operation PAssert$13/GroupGlobally/GroupDummyAndContents/Create
    Dec 31, 2018 8:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.473Z: Starting 1 workers in us-central1-b...
    Dec 31, 2018 8:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.499Z: Executing operation Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Dec 31, 2018 8:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.745Z: Executing operation PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Impulse+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Dec 31, 2018 8:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:21.789Z: Executing operation Flatten.PCollections/Create.Values/Read(CreateSource)/Impulse+Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair with random key+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Dec 31, 2018 8:52:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:52:44.849Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Dec 31, 2018 8:53:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:53:19.478Z: Workers have started successfully.
    Dec 31, 2018 8:53:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:53:29.344Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Dec 31, 2018 8:53:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:53:29.390Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Dec 31, 2018 8:53:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T20:53:50.543Z: Workers have started successfully.
    Dec 31, 2018 9:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2018-12-31T21:52:21.857Z: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
    Dec 31, 2018 9:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T21:52:22.101Z: Cancel request is committed for workflow job: 2018-12-31_12_52_10-2719036602827136060.
    Dec 31, 2018 9:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T21:52:22.194Z: Cleaning up.
    Dec 31, 2018 9:52:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T21:52:22.289Z: Stopping worker pool...
    Dec 31, 2018 9:52:23 PM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
    INFO: Dataflow job 2018-12-31_12_52_10-2719036602827136060 threw exception. Failure message was: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
    Dec 31, 2018 9:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T21:54:47.446Z: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
    Dec 31, 2018 9:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-12-31T21:54:47.489Z: Worker pool stopped.
    Dec 31, 2018 9:54:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-12-31_12_52_10-2719036602827136060 failed with status FAILED.
    Dec 31, 2018 9:54:53 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-12-31_12_52_10-2719036602827136060. Found 0 success, 0 failures out of 0 expected assertions.

org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo FAILED
    java.lang.RuntimeException: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at org.apache.beam.sdk.transforms.FlattenTest.testFlattenPCollectionsEmptyThenParDo(FlattenTest.java:224)
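
    [Editor's note: for context, the failing test exercises the "Flatten an empty list of PCollections, then apply a ParDo" pattern whose fused stages appear in the log above. The sketch below illustrates that pattern against the Beam Java SDK; it is not the actual FlattenTest source, and the identity DoFn, coder choice, and options handling are assumptions made for illustration.]

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.Flatten;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionList;

    public class FlattenEmptyThenParDoSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Flatten an empty list of PCollections. The coder must be set explicitly
        // because there are no input collections to infer it from.
        PCollection<String> flattened =
            PCollectionList.<String>empty(p)
                .apply(Flatten.pCollections())
                .setCoder(StringUtf8Coder.of());

        // Apply a trivial identity ParDo downstream of the empty Flatten.
        PCollection<String> output =
            flattened.apply(
                "ParDo(Identity)",
                ParDo.of(
                    new DoFn<String, String>() {
                      @ProcessElement
                      public void processElement(ProcessContext c) {
                        c.output(c.element());
                      }
                    }));

        // The flattened collection carries no elements, so the output should be empty.
        // PAssert checks this when the pipeline runs on the chosen runner.
        PAssert.that(output).empty();

        p.run().waitUntilFinish();
      }
    }

    [In the failed run above, the job never completed this check: the workflow was cancelled after one hour with no worker activity, so the assertion machinery reported 0 of 0 expected assertions rather than a test-logic failure.]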

Gradle Test Executor 116 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerFnApiWorkerExecutableStageTest FAILED

71 tests completed, 3 failed, 1 skipped
Finished generating test XML results (0.168 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/test-results/validatesRunnerFnApiWorkerExecutableStageTest>
Generating HTML test report...
Finished generating test html results (0.205 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerFnApiWorkerExecutableStageTest>
:beam-runners-google-cloud-dataflow-java:validatesRunnerFnApiWorkerExecutableStageTest (Thread[Task worker for ':' Thread 77,5,main]) completed. Took 3 hrs 52 mins 4.736 secs.
:beam-runners-google-cloud-dataflow-java:cleanUpDockerImages (Thread[Task worker for ':' Thread 77,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java:cleanUpDockerImages FAILED
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181231180019
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181231180019
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:5b9986ac596bbfe09e1b752f2476e4ab9f9cec4e80964e3b4b9c16557dab5382
Starting process 'command 'gcloud''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181231180019
Successfully started process 'command 'gcloud''
ERROR: (gcloud.container.images.delete) [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181231180019] is not a valid name. Expected tag in the form "base:tag" or "tag" or digest in the form "sha256:<digest>"
:beam-runners-google-cloud-dataflow-java:cleanUpDockerImages (Thread[Task worker for ':' Thread 77,5,main]) completed. Took 0.795 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:validatesRunnerFnApiWorkerExecutableStageTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerFnApiWorkerExecutableStageTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 459

* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 54m 46s
76 actionable tasks: 70 executed, 5 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/mwck7foshsbza

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow #419

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/419/display/redirect>

