Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/01/05 02:40:03 UTC

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow #1792

See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/1792/display/redirect?page=changes>

Changes:

[github] Reduce days to keep Jenkins job logs to 14

------------------------------------------
[...truncated 20.70 MB...]
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView as step s9
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create123/Read(CreateSource) as step s10
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding OutputSideInputs as step s11
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/Window.Into()/Window.Assign as step s12
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) as step s13
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map as step s14
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign as step s15
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey as step s16
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map as step s17
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/RewindowActuals/Window.Assign as step s18
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map as step s19
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/RemoveActualsTriggering/Flatten.PCollections as step s20
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/Create.Values/Read(CreateSource) as step s21
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign as step s22
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/RemoveDummyTriggering/Flatten.PCollections as step s23
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/FlattenDummyAndContents as step s24
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/NeverTrigger/Flatten.PCollections as step s25
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/GroupDummyAndContents as step s26
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/Values/Values/Map as step s27
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GroupGlobally/ParDo(Concat) as step s28
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/GetPane/Map as step s29
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/RunChecks as step s30
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$33/VerifyAssertions/ParDo(DefaultConclude) as step s31
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-validates-runner-tests//viewtest0testsingletonsideinput-jenkins-0105023603-c92551c2/output/results/staging/
    Jan 05, 2019 2:36:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <70882 bytes, hash H0xS7U_BcBYReyeLht6QPg> to gs://temp-storage-for-validates-runner-tests//viewtest0testsingletonsideinput-jenkins-0105023603-c92551c2/output/results/staging/pipeline-H0xS7U_BcBYReyeLht6QPg.pb

org.apache.beam.sdk.transforms.ViewTest > testSingletonSideInput STANDARD_OUT
    Dataflow SDK version: 2.10.0-SNAPSHOT

org.apache.beam.sdk.transforms.ViewTest > testSingletonSideInput STANDARD_ERROR
    Jan 05, 2019 2:36:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-04_18_36_08-9988148273497037146?project=apache-beam-testing

org.apache.beam.sdk.transforms.ViewTest > testSingletonSideInput STANDARD_OUT
    Submitted job: 2019-01-04_18_36_08-9988148273497037146

org.apache.beam.sdk.transforms.ViewTest > testSingletonSideInput STANDARD_ERROR
    Jan 05, 2019 2:36:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2019-01-04_18_36_08-9988148273497037146
    Jan 05, 2019 2:36:10 AM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2019-01-04_18_36_08-9988148273497037146 with 1 expected assertions.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:09.055Z: Autoscaling is enabled for job 2019-01-04_18_36_08-9988148273497037146. The number of workers will be between 1 and 1000.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:09.089Z: Autoscaling was automatically enabled for job 2019-01-04_18_36_08-9988148273497037146.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:44.329Z: Checking permissions granted to controller Service Account.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:50.768Z: Worker configuration: n1-standard-1 in us-central1-b.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:51.387Z: Expanding CoGroupByKey operations into optimizable parts.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:51.668Z: Expanding GroupByKey operations into optimizable parts.
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:51.729Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.111Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.156Z: Elided trivial flatten 
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.215Z: Elided trivial flatten 
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.263Z: Elided trivial flatten 
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.322Z: Unzipping flatten s24 for input s19.org.apache.beam.sdk.values.PCollection.<init>:402#f234eb3f2c0e3f86
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.385Z: Fusing unzipped copy of PAssert$33/GroupGlobally/GroupDummyAndContents/Reify, through flatten PAssert$33/GroupGlobally/FlattenDummyAndContents, into producer PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.440Z: Fusing consumer PAssert$33/GroupGlobally/GroupDummyAndContents/GroupByWindow into PAssert$33/GroupGlobally/GroupDummyAndContents/Read
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.493Z: Fusing consumer PAssert$33/GroupGlobally/ParDo(Concat) into PAssert$33/GroupGlobally/Values/Values/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.543Z: Fusing consumer PAssert$33/GetPane/Map into PAssert$33/GroupGlobally/ParDo(Concat)
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.602Z: Fusing consumer PAssert$33/GroupGlobally/Values/Values/Map into PAssert$33/GroupGlobally/GroupDummyAndContents/GroupByWindow
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.637Z: Fusing consumer PAssert$33/RunChecks into PAssert$33/GetPane/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.697Z: Fusing consumer PAssert$33/VerifyAssertions/ParDo(DefaultConclude) into PAssert$33/RunChecks
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.743Z: Unzipping flatten s24-u45 for input s26-reify-value18-c43
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.791Z: Fusing unzipped copy of PAssert$33/GroupGlobally/GroupDummyAndContents/Write, through flatten s24-u45, into producer PAssert$33/GroupGlobally/GroupDummyAndContents/Reify
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.848Z: Fusing consumer PAssert$33/GroupGlobally/GroupDummyAndContents/Reify into PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.902Z: Fusing consumer PAssert$33/GroupGlobally/GroupDummyAndContents/Write into PAssert$33/GroupGlobally/GroupDummyAndContents/Reify
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:52.962Z: Fusing consumer PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign into PAssert$33/GroupGlobally/Create.Values/Read(CreateSource)
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.016Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map into PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.060Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Write into PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Reify
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.108Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) into PAssert$33/GroupGlobally/Window.Into()/Window.Assign
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.165Z: Fusing consumer OutputSideInputs into Create123/Read(CreateSource)
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.217Z: Fusing consumer PAssert$33/GroupGlobally/Window.Into()/Window.Assign into OutputSideInputs
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.273Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map into PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.339Z: Fusing consumer PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map into PAssert$33/GroupGlobally/RewindowActuals/Window.Assign
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.401Z: Fusing consumer PAssert$33/GroupGlobally/RewindowActuals/Window.Assign into PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.457Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow into PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Read
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.518Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Reify into PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.576Z: Fusing consumer PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign into PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.630Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.693Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.739Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.794Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into Create47/Read(CreateSource)
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.853Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.913Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:53.964Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:54.024Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:54.094Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:54.152Z: Fusing consumer View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:54.932Z: Executing operation PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Create
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:55.000Z: Executing operation PAssert$33/GroupGlobally/GroupDummyAndContents/Create
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:55.042Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:55.064Z: Starting 1 workers in us-central1-b...
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:55.110Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:55.801Z: Executing operation PAssert$33/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$33/GroupGlobally/GroupDummyAndContents/Reify+PAssert$33/GroupGlobally/GroupDummyAndContents/Write
    Jan 05, 2019 2:36:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:36:55.890Z: Executing operation Create47/Read(CreateSource)+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Jan 05, 2019 2:37:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:07.236Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jan 05, 2019 2:37:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:40.937Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jan 05, 2019 2:37:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:41.005Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jan 05, 2019 2:37:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:41.612Z: Workers have started successfully.
    Jan 05, 2019 2:37:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:41.663Z: Workers have started successfully.
    Jan 05, 2019 2:38:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:58.279Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Jan 05, 2019 2:38:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:37:58.499Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Jan 05, 2019 2:38:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:07.561Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Jan 05, 2019 2:38:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:07.778Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Jan 05, 2019 2:38:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:17.681Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    Jan 05, 2019 2:38:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:18.034Z: Executing operation Create123/Read(CreateSource)+OutputSideInputs+PAssert$33/GroupGlobally/Window.Into()/Window.Assign+PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)+PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map+PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign+PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Reify+PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Write
    Jan 05, 2019 2:38:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:24.456Z: Executing operation PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Close
    Jan 05, 2019 2:38:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:24.690Z: Executing operation PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/Read+PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow+PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map+PAssert$33/GroupGlobally/RewindowActuals/Window.Assign+PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map+PAssert$33/GroupGlobally/GroupDummyAndContents/Reify+PAssert$33/GroupGlobally/GroupDummyAndContents/Write
    Jan 05, 2019 2:38:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:29.546Z: Executing operation PAssert$33/GroupGlobally/GroupDummyAndContents/Close
    Jan 05, 2019 2:38:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:29.728Z: Executing operation PAssert$33/GroupGlobally/GroupDummyAndContents/Read+PAssert$33/GroupGlobally/GroupDummyAndContents/GroupByWindow+PAssert$33/GroupGlobally/Values/Values/Map+PAssert$33/GroupGlobally/ParDo(Concat)+PAssert$33/GetPane/Map+PAssert$33/RunChecks+PAssert$33/VerifyAssertions/ParDo(DefaultConclude)
    Jan 05, 2019 2:38:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:34.705Z: Cleaning up.
    Jan 05, 2019 2:38:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:38:34.823Z: Stopping worker pool...
    Jan 05, 2019 2:39:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:39:53.961Z: Autoscaling: Resized worker pool from 1 to 0.
    Jan 05, 2019 2:39:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:39:54.072Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jan 05, 2019 2:39:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-01-05T02:39:54.132Z: Worker pool stopped.
    Jan 05, 2019 2:40:00 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2019-01-04_18_36_08-9988148273497037146 finished with status DONE.
    Jan 05, 2019 2:40:00 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2019-01-04_18_36_08-9988148273497037146. Found 1 success, 0 failures out of 1 expected assertions.

Gradle Test Executor 104 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest FAILED

216 tests completed, 1 failed, 2 skipped
Finished generating test XML results (0.525 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/test-results/validatesRunnerLegacyWorkerTest>
Generating HTML test report...
Finished generating test html results (0.586 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerLegacyWorkerTest>
:beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 2 hrs 23 mins 33.889 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerLegacyWorkerTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
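
A minimal sketch of rerunning the failing task locally with the diagnostics Gradle suggests above (the task name and flags are taken from this output; the local checkout, wrapper path, and the GCP project/credentials that the Dataflow ValidatesRunner tests require are assumptions):

  # hypothetical local invocation; not part of the Jenkins log above
  $ ./gradlew :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest --stacktrace --scan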

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 25m 44s
58 actionable tasks: 53 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/pdg37xvqtlabm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Dataflow #1795

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/1795/display/redirect>



Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow #1794

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/1794/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam6 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6c024c6400b69a743970f8561e0214e1087a5288 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6c024c6400b69a743970f8561e0214e1087a5288
Commit message: "Merge pull request #7292 from bramp/BEAM-6155"
 > git rev-list --no-walk 6c024c6400b69a743970f8561e0214e1087a5288 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/gradlew> --continue --max-workers=120 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :beam-runners-google-cloud-dataflow-java:validatesRunner
To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: https://docs.gradle.org/4.10.3/userguide/gradle_daemon.html.
Daemon will be stopped at the end of the build stopping after processing
Parallel execution is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:processResources NO-SOURCE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:check
> Task :buildSrc:build
Parallel execution with configuration on demand is an incubating feature.

> Configure project :beam-model-pipeline
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/model/pipeline/v1/**]] for project beam-model-pipeline

> Configure project :beam-model-job-management
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/model/jobmanagement/v1/**]] for project beam-model-job-management

> Configure project :beam-model-fn-execution
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/model/fnexecution/v1/**]] for project beam-model-fn-execution

> Configure project :beam-runners-google-cloud-dataflow-java-windmill
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/runners/dataflow/worker/windmill/**]] for project beam-runners-google-cloud-dataflow-java-windmill

> Task :beam-runners-java-fn-execution:processResources NO-SOURCE
> Task :beam-vendor-sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :beam-runners-core-java:processResources NO-SOURCE
> Task :beam-sdks-java-core:generateAvroProtocol NO-SOURCE
> Task :beam-sdks-java-fn-execution:processResources NO-SOURCE
> Task :beam-sdks-java-extensions-google-cloud-platform-core:processResources NO-SOURCE
> Task :beam-sdks-java-io-google-cloud-platform:processResources NO-SOURCE
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:processResources NO-SOURCE
> Task :beam-sdks-java-harness:processResources NO-SOURCE
> Task :beam-runners-core-construction-java:processResources NO-SOURCE
> Task :beam-sdks-java-core:generateAvroJava NO-SOURCE
> Task :beam-model-job-management:extractProto
> Task :beam-sdks-java-extensions-protobuf:extractProto
> Task :beam-sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:generateTestAvroProtocol NO-SOURCE
> Task :beam-runners-google-cloud-dataflow-java:processResources
> Task :beam-model-job-management:processResources
> Task :beam-model-fn-execution:extractProto
> Task :beam-model-fn-execution:processResources
> Task :beam-sdks-java-core:generateTestAvroJava
> Task :beam-sdks-java-core:processTestResources NO-SOURCE
> Task :beam-runners-google-cloud-dataflow-java-windmill:extractIncludeProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:extractProto
> Task :beam-model-pipeline:extractIncludeProto
> Task :beam-model-pipeline:extractProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:generateProto
> Task :beam-model-pipeline:generateProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:compileJava FROM-CACHE
> Task :beam-runners-google-cloud-dataflow-java-windmill:processResources
> Task :beam-runners-google-cloud-dataflow-java-windmill:classes
> Task :beam-model-pipeline:compileJava FROM-CACHE
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-job-management:generateProto
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes
> Task :beam-runners-google-cloud-dataflow-java-windmill:shadowJar
> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:shadowJar
> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:classes
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:shadowJar
> Task :beam-sdks-java-extensions-protobuf:extractIncludeProto
> Task :beam-sdks-java-extensions-protobuf:generateProto NO-SOURCE

> Task :beam-vendor-sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-extensions-protobuf:classes
> Task :beam-vendor-sdks-java-extensions-protobuf:classes

> Task :beam-sdks-java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-fn-execution:classes

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/sdks/java/extensions/google-cloud-platform-core/src/main/java/org/apache/beam/sdk/util/GcsUtil.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:classes
> Task :beam-vendor-sdks-java-extensions-protobuf:shadowJar
> Task :beam-sdks-java-extensions-protobuf:shadowJar
> Task :beam-sdks-java-fn-execution:shadowJar
> Task :beam-sdks-java-extensions-google-cloud-platform-core:shadowJar

> Task :beam-runners-core-construction-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-core-construction-java:classes
> Task :beam-runners-core-construction-java:shadowJar
> Task :beam-sdks-java-core:compileTestJava

> Task :beam-sdks-java-io-google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-io-google-cloud-platform:classes

> Task :beam-runners-core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-io-google-cloud-platform:shadowJar
> Task :beam-runners-core-java:classes

> Task :beam-sdks-java-core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-core:testClasses
> Task :beam-runners-core-java:shadowJar

> Task :beam-runners-google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-google-cloud-dataflow-java:classes

> Task :beam-sdks-java-harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-harness:classes
> Task :beam-sdks-java-harness:jar
> Task :beam-runners-google-cloud-dataflow-java:shadowJar
> Task :beam-sdks-java-core:shadowTestJar
> Task :beam-sdks-java-harness:shadowJar

> Task :beam-runners-java-fn-execution:compileJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/ServerFactory.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-java-fn-execution:classes
> Task :beam-runners-java-fn-execution:shadowJar

> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:classes
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:shadowJar
> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest

org.apache.beam.sdk.transforms.ParDoTest$TimerCoderInferenceTests > testValueStateCoderInferenceFromInputCoder FAILED
    java.lang.RuntimeException at ParDoTest.java:3306

216 tests completed, 1 failed, 2 skipped
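
A sketch of re-running only the failing test locally via Gradle's --tests filter (the task and test class come from the failure above; the wildcard pattern, the local checkout, and the Dataflow project/credentials the suite needs are assumptions):

  # hypothetical local invocation; not part of the Jenkins log above
  $ ./gradlew :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest \
      --tests 'org.apache.beam.sdk.transforms.ParDoTest*' --stacktrace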

> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerLegacyWorkerTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 21m 44s
58 actionable tasks: 53 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/yumuqgnmmv57m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow #1793

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/1793/display/redirect?page=changes>

Changes:

[bramp] Plumb the contexts though the gcsx library.

[boyuanz] Improve BigQuery test utils and BigQueryToTableIT

[boyuanz] Fix Andrew's comments

------------------------------------------
[...truncated 118 B...]
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6c024c6400b69a743970f8561e0214e1087a5288 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6c024c6400b69a743970f8561e0214e1087a5288
Commit message: "Merge pull request #7292 from bramp/BEAM-6155"
 > git rev-list --no-walk a0bc8a458261a4d49f69ae084643454417cd1da9 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/gradlew> --continue --max-workers=120 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :beam-runners-google-cloud-dataflow-java:validatesRunner
To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: https://docs.gradle.org/4.10.3/userguide/gradle_daemon.html.
Daemon will be stopped at the end of the build stopping after processing
Parallel execution is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:processResources NO-SOURCE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:check
> Task :buildSrc:build
Parallel execution with configuration on demand is an incubating feature.

> Configure project :beam-model-pipeline
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/model/pipeline/v1/**]] for project beam-model-pipeline

> Configure project :beam-model-job-management
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/model/jobmanagement/v1/**]] for project beam-model-job-management

> Configure project :beam-model-fn-execution
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/model/fnexecution/v1/**]] for project beam-model-fn-execution

> Configure project :beam-runners-google-cloud-dataflow-java-windmill
applyPortabilityNature with [shadowJarValidationExcludes:[org/apache/beam/runners/dataflow/worker/windmill/**]] for project beam-runners-google-cloud-dataflow-java-windmill

> Task :beam-vendor-sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :beam-sdks-java-fn-execution:processResources NO-SOURCE
> Task :beam-sdks-java-core:generateAvroProtocol NO-SOURCE
> Task :beam-runners-java-fn-execution:processResources NO-SOURCE
> Task :beam-runners-core-construction-java:processResources NO-SOURCE
> Task :beam-sdks-java-harness:processResources NO-SOURCE
> Task :beam-sdks-java-extensions-google-cloud-platform-core:processResources NO-SOURCE
> Task :beam-sdks-java-io-google-cloud-platform:processResources NO-SOURCE
> Task :beam-runners-core-java:processResources NO-SOURCE
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:processResources NO-SOURCE
> Task :beam-sdks-java-core:generateAvroJava NO-SOURCE
> Task :beam-model-fn-execution:extractProto
> Task :beam-model-job-management:extractProto
> Task :beam-sdks-java-extensions-protobuf:extractProto
> Task :beam-model-fn-execution:processResources
> Task :beam-sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :beam-model-job-management:processResources
> Task :beam-runners-google-cloud-dataflow-java:processResources
> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:generateTestAvroProtocol NO-SOURCE
> Task :beam-model-pipeline:extractIncludeProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:extractIncludeProto
> Task :beam-model-pipeline:extractProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:extractProto
> Task :beam-sdks-java-core:generateTestAvroJava
> Task :beam-sdks-java-core:processTestResources NO-SOURCE
> Task :beam-runners-google-cloud-dataflow-java-windmill:generateProto
> Task :beam-model-pipeline:generateProto
> Task :beam-runners-google-cloud-dataflow-java-windmill:compileJava FROM-CACHE
> Task :beam-runners-google-cloud-dataflow-java-windmill:processResources
> Task :beam-runners-google-cloud-dataflow-java-windmill:classes
> Task :beam-model-pipeline:compileJava FROM-CACHE
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-job-management:generateProto
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes
> Task :beam-runners-google-cloud-dataflow-java-windmill:shadowJar
> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:shadowJar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:classes
> Task :beam-sdks-java-core:shadowJar
> Task :beam-sdks-java-extensions-protobuf:extractIncludeProto
> Task :beam-sdks-java-extensions-protobuf:generateProto NO-SOURCE

> Task :beam-vendor-sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-extensions-protobuf:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-extensions-protobuf:classes
> Task :beam-vendor-sdks-java-extensions-protobuf:classes

> Task :beam-sdks-java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-fn-execution:classes

> Task :beam-sdks-java-extensions-google-cloud-platform-core:compileJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/sdks/java/extensions/google-cloud-platform-core/src/main/java/org/apache/beam/sdk/util/GcsUtil.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:classes
> Task :beam-vendor-sdks-java-extensions-protobuf:shadowJar
> Task :beam-sdks-java-extensions-protobuf:shadowJar
> Task :beam-sdks-java-fn-execution:shadowJar
> Task :beam-sdks-java-extensions-google-cloud-platform-core:shadowJar

> Task :beam-runners-core-construction-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-core-construction-java:classes
> Task :beam-runners-core-construction-java:shadowJar

> Task :beam-sdks-java-io-google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-io-google-cloud-platform:classes
> Task :beam-sdks-java-io-google-cloud-platform:shadowJar

> Task :beam-runners-core-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-core-java:classes

> Task :beam-sdks-java-core:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-core-java:shadowJar
> Task :beam-sdks-java-core:testClasses

> Task :beam-runners-google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-google-cloud-dataflow-java:classes

> Task :beam-sdks-java-harness:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-google-cloud-dataflow-java:shadowJar
> Task :beam-sdks-java-harness:classes
> Task :beam-sdks-java-harness:jar
> Task :beam-sdks-java-core:shadowTestJar
> Task :beam-sdks-java-harness:shadowJar

> Task :beam-runners-java-fn-execution:compileJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/java-fn-execution/src/main/java/org/apache/beam/runners/fnexecution/ServerFactory.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-java-fn-execution:classes
> Task :beam-runners-java-fn-execution:shadowJar

> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:classes
> Task :beam-runners-google-cloud-dataflow-java-legacy-worker:shadowJar
> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest

org.apache.beam.sdk.transforms.CombineTest$WindowingTests > testCombineGloballyInstanceMethodReference FAILED
    java.lang.RuntimeException at CombineTest.java:1420

org.apache.beam.sdk.testing.PAssertTest > testContainsInAnyOrder FAILED
    java.lang.RuntimeException at PAssertTest.java:378

216 tests completed, 2 failed, 2 skipped
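
Gradle accepts repeated --tests filters, so both failing suites could in principle be narrowed down in one run; a sketch under the same assumptions as above (local checkout plus Dataflow test credentials):

  # hypothetical local invocation; not part of the Jenkins log above
  $ ./gradlew :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest \
      --tests '*CombineTest*' --tests '*PAssertTest*'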

> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:validatesRunnerLegacyWorkerTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerLegacyWorkerTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 34m 27s
58 actionable tasks: 53 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/6qodejb2bg5no

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
