Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/10/31 06:27:16 UTC

Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2227

See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2227/display/redirect>

Changes:


------------------------------------------
[...truncated 309.69 KB...]
    INFO: Adding Collect read time as step s2
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213482 bytes, hash 865dd601391af8407907fc2593141261dd496f2547dc4d893d39323c18fe7ffe> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-hl3WATka-EB5B_wlkxQSYd1JbyVH3E2JPTkyPBj-f_4.pb
    Oct 31, 2020 6:24:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 6:24:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-30_23_24_17-6323082367080418887?project=apache-beam-testing
    Oct 31, 2020 6:24:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-30_23_24_17-6323082367080418887
    Oct 31, 2020 6:24:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-30_23_24_17-6323082367080418887
    Oct 31, 2020 6:24:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T06:24:21.755Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-103-8j02. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Oct 31, 2020 6:24:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:33.321Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:33.970Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.071Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.106Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.241Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.269Z: Elided trivial flatten 
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.304Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.339Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.371Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.387Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.417Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.455Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.495Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.522Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.571Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.605Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.629Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.668Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.712Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.746Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.774Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.824Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.858Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.894Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.924Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.947Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:34.978Z: Fusing consumer Get values only/Values/Map into Collect read time
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.013Z: Fusing consumer Values as string into Get values only/Values/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.044Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.071Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.106Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.140Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.173Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.198Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.234Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.270Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 31, 2020 6:24:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.303Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.827Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.864Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.906Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.958Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:35.974Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:36.082Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:24:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:36.128Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Oct 31, 2020 6:24:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:58.597Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:24:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:24:58.626Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 31, 2020 6:25:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:01.282Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 31, 2020 6:25:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:03.882Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:25:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:20.994Z: Workers have started successfully.
    Oct 31, 2020 6:25:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:21.028Z: Workers have started successfully.
    Oct 31, 2020 6:25:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:45.580Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:25:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:56.385Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Oct 31, 2020 6:25:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:56.458Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Oct 31, 2020 6:25:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:56.510Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Oct 31, 2020 6:25:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:25:56.580Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:05.833Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:06.020Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Oct 31, 2020 6:26:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:06.078Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Oct 31, 2020 6:26:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:06.231Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:08.217Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:08.285Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Oct 31, 2020 6:26:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:08.337Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Oct 31, 2020 6:26:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:08.411Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Oct 31, 2020 6:26:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:12.676Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Oct 31, 2020 6:26:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:12.833Z: Cleaning up.
    Oct 31, 2020 6:26:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:26:12.906Z: Stopping worker pool...
    Oct 31, 2020 6:27:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:27:07.573Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 6:27:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:27:07.615Z: Worker pool stopped.
    Oct 31, 2020 6:27:12 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-30_23_24_17-6323082367080418887 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): d74a6673-85e4-4be8-a51f-06860a3d76c8 and timestamp: 2020-10-31T06:27:12.949000000Z:
                     Metric:                    Value:
                   read_time                    10.573

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)
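
The NoSuchMethodError above is a dependency-version problem rather than a test-logic failure: StringUtils.isNoneBlank(CharSequence...) was only added in commons-lang3 3.2, so this error usually means an older commons-lang3 jar (plausibly a transitive Hadoop test dependency) won on the integrationTest runtime classpath and shadowed the version Beam compiled against. A minimal, self-contained sketch of the kind of settings check InfluxDBPublisher performs before publishing, with hypothetical placeholder values and not Beam's actual code:

    // Sketch only: the varargs overload used here exists from commons-lang3 3.2 onward,
    // so linking this against an older 3.x jar reproduces the NoSuchMethodError above.
    import org.apache.commons.lang3.StringUtils;

    public class InfluxSettingsCheck {
      public static void main(String[] args) {
        String host = "http://influx.example:8086"; // hypothetical placeholder
        String database = "beam_test_metrics";      // hypothetical placeholder
        // Resolves to boolean isNoneBlank(CharSequence...) on the runtime classpath.
        if (StringUtils.isNoneBlank(host, database)) {
          System.out.println("InfluxDB settings look complete; metrics would be published.");
        } else {
          System.out.println("Skipping publication: host or database is blank.");
        }
      }
    }

Forcing commons-lang3 to a 3.2+ version for the integrationTest runtime configuration, or inspecting the resolved version with Gradle's dependencyInsight task, would be one way to confirm or rule this out.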

1 test completed, 1 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution **** for ':' Thread 7,5,main]) completed. Took 6 mins 39.814 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 34s
95 actionable tasks: 60 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/xylm6w63vgddk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_HadoopFormat #2237

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2237/display/redirect?page=changes>



Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2236

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2236/display/redirect>

Changes:


------------------------------------------
[...truncated 226.79 KB...]
:examples:java:compileTestJava (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.144 secs.
:examples:java:testClasses (Thread[Execution **** for ':' Thread 8,5,main]) started.

> Task :examples:java:testClasses
Skipping task ':examples:java:testClasses' as it has no actions.
:examples:java:testClasses (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:examples:java:testJar (Thread[Execution **** for ':' Thread 8,5,main]) started.

> Task :examples:java:testJar
Caching disabled for task ':examples:java:testJar' because:
  Caching has not been enabled for the task
Task ':examples:java:testJar' is not up-to-date because:
  No history is available.
:examples:java:testJar (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.022 secs.

> Task :runners:direct-java:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/direct-java/build/resources/main'.>
Custom actions are attached to task ':runners:direct-java:shadowJar'.
Caching disabled for task ':runners:direct-java:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/direct-java/build/resources/main',> not found
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 0.658s [658ms]
Average Time/Jar: 0.1096666666667s [109.6666666667ms]
*******************
:runners:direct-java:shadowJar (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.919 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/src/main/java',> not found
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is 871845356925bc3579a3025bbc73f2d0
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache key 871845356925bc3579a3025bbc73f2d0
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Execution **** for ':' Thread 9,5,main]) completed. Took 0.35 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** for ':' Thread 9,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** for ':' Thread 9,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is dd99e219afe04ca113fa9d38d238fd81
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key dd99e219afe04ca113fa9d38d238fd81
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.46 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.32 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 7d1e393fd53afd27e82a6538cef72458
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 7d1e393fd53afd27e82a6538cef72458
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.215 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/build/resources/test'.>
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/build/resources/test',> not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.161 secs.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main'.>
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package'.>
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Caching disabled for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main',> not found
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package',> not found
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 3.397s [3397ms]
Average Time/Jar: 0.2123125s [212.3125ms]
*******************
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** for ':' Thread 9,5,main]) completed. Took 4.429 secs.
:sdks:java:io:hadoop-format:compileTestJava (Thread[Execution **** for ':' Thread 9,5,main]) started.

> Task :sdks:java:io:hadoop-format:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:hadoop-format:compileTestJava'.
Build cache key for task ':sdks:java:io:hadoop-format:compileTestJava' is ac779f18bee677d1907413826efbbc4c
Task ':sdks:java:io:hadoop-format:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:hadoop-format:compileTestJava' with cache key ac779f18bee677d1907413826efbbc4c
:sdks:java:io:hadoop-format:compileTestJava (Thread[Execution **** for ':' Thread 9,5,main]) completed. Took 0.544 secs.
:sdks:java:io:hadoop-format:testClasses (Thread[Execution **** for ':' Thread 9,5,main]) started.

> Task :sdks:java:io:hadoop-format:testClasses
Skipping task ':sdks:java:io:hadoop-format:testClasses' as it has no actions.
:sdks:java:io:hadoop-format:testClasses (Thread[Execution **** for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution **** for ':' Thread 11,5,main]) started.

> Task :sdks:java:io:hadoop-format:integrationTest
Custom actions are attached to task ':sdks:java:io:hadoop-format:integrationTest'.
Build cache key for task ':sdks:java:io:hadoop-format:integrationTest' is 4657d942fbae9efe46f8b606fe4a23c4
Task ':sdks:java:io:hadoop-format:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--numberOfRecords=600000","--bigQueryDataset=beam_performance","--bigQueryTable=hadoopformatioit_results","--influxMeasurement=hadoopformatioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresServerName=34.123.89.149","--postgresSsl=false","--postgresPort=5432","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.26.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.6.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.26.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.1.3/d90276fff414f06cb375f2057f6778cd63c6082f/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.30/c21f55139d8141d2231214fb1feaf50a1edca95e/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Nov 02, 2020 12:20:25 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Attempt #1 of 3 failed: The connection attempt failed..
    Nov 02, 2020 12:20:25 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Retrying in 2000 ms.
    Nov 02, 2020 12:20:37 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Attempt #2 of 3 failed: The connection attempt failed..
    Nov 02, 2020 12:20:37 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Retrying in 4000 ms.
    Nov 02, 2020 12:20:51 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Attempt #3 of 3 failed: The connection attempt failed..
    Nov 02, 2020 12:21:01 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Attempt #1 of 3 failed: The connection attempt failed..
    Nov 02, 2020 12:21:01 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Retrying in 2000 ms.
    Nov 02, 2020 12:21:03 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Attempt #2 of 3 failed: ERROR: table "beamtest_hadoopformatioit_2020_11_02_12_20_15_202" does not exist.
    Nov 02, 2020 12:21:03 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Retrying in 4000 ms.
    Nov 02, 2020 12:21:07 PM org.apache.beam.sdk.io.common.IOITHelper executeWithRetry
    WARNING: Attempt #3 of 3 failed: ERROR: table "beamtest_hadoopformatioit_2020_11_02_12_20_15_202" does not exist.
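
The warnings above show the executeWithRetry pattern at work: three attempts with a delay that doubles from 2000 ms to 4000 ms. The "table does not exist" failures in the second group are a knock-on effect of the first: tearDown is trying to drop a table that setUp never managed to create. A generic sketch of that retry-with-backoff pattern, as an illustration only and not Beam's actual IOITHelper implementation:

    import java.util.concurrent.Callable;

    public final class RetryWithBackoff {
      // Runs the action up to maxAttempts times, doubling the delay between attempts,
      // mirroring the "Attempt #N of 3 ... Retrying in X ms" messages above.
      public static <T> T executeWithRetry(
          int maxAttempts, long initialDelayMillis, Callable<T> action) throws Exception {
        long delay = initialDelayMillis;
        for (int attempt = 1; ; attempt++) {
          try {
            return action.call();
          } catch (Exception e) {
            System.err.printf("Attempt #%d of %d failed: %s%n", attempt, maxAttempts, e.getMessage());
            if (attempt == maxAttempts) {
              throw e; // out of attempts: surface the last failure to the caller
            }
            System.err.printf("Retrying in %d ms.%n", delay);
            Thread.sleep(delay);
            delay *= 2; // 2000 ms, then 4000 ms, as in the log
          }
        }
      }
    }

A call such as executeWithRetry(3, 2000, () -> { createTable(); return null; }), where createTable stands in for whatever setup action is being retried, produces the same sequence of warnings when every attempt fails.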

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > classMethod FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:45)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.createTable(HadoopFormatIOIT.java:132)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:87)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:67)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.setUp(HadoopFormatIOIT.java:127)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 13 more
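
The root cause for this run is at the bottom of the trace: a plain TCP connect timeout to the Postgres server before any SQL is issued, which points at the database host or its firewall rather than at Beam or the test code. A standalone connectivity probe along these lines can separate the two; this is a sketch, not part of the test suite, with the server and port taken from the -DbeamTestPipelineOptions shown earlier, placeholder credentials, and org.postgresql:postgresql required on the classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    public class PostgresConnectivityProbe {
      public static void main(String[] args) throws Exception {
        // connectTimeout/socketTimeout are standard PostgreSQL JDBC driver properties (in seconds).
        String url = "jdbc:postgresql://34.123.89.149:5432/postgres?connectTimeout=10&socketTimeout=30";
        Properties props = new Properties();
        props.setProperty("user", "postgres");       // username from the pipeline options
        props.setProperty("password", "<password>"); // placeholder
        try (Connection conn = DriverManager.getConnection(url, props)) {
          System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductVersion());
        }
        // A SocketTimeoutException here, like the one in the trace above, indicates the host
        // is unreachable (instance down, wrong address, or firewall), independent of the test.
      }
    }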

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > classMethod FAILED
    org.postgresql.util.PSQLException: ERROR: table "beamtest_hadoopformatioit_2020_11_02_12_20_15_202" does not exist
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
        at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
        at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:322)
        at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:308)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:284)
        at org.postgresql.jdbc.PgStatement.executeUpdate(PgStatement.java:258)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.deleteTable(DatabaseTestHelper.java:65)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.deleteTable(HadoopFormatIOIT.java:172)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:87)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:67)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.tearDown(HadoopFormatIOIT.java:168)

2 tests completed, 2 failed
Finished generating test XML results (0.015 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution **** for ':' Thread 11,5,main]) completed. Took 57.502 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 47s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/btf7mqgnqpqxy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2235

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2235/display/redirect>

Changes:


------------------------------------------
[...truncated 306.04 KB...]
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213474 bytes, hash 7965cb4fa08ecaa287d8c0c3c6c875423c9845e3cbe5752379bbf538e9352bd6> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-eWXLT6COyqKH2MDDxsh1QjyYRePL5XUjebv1OOk1K9Y.pb
    Nov 02, 2020 6:24:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 02, 2020 6:24:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_22_24_22-2064084361431052722?project=apache-beam-testing
    Nov 02, 2020 6:24:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_22_24_22-2064084361431052722
    Nov 02, 2020 6:24:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_22_24_22-2064084361431052722
    Nov 02, 2020 6:24:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-02T06:24:27.272Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-110-no32. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 02, 2020 6:24:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:37.764Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.455Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.582Z: Expanding GroupByKey operations into optimizable parts.
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.608Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.747Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.779Z: Elided trivial flatten 
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.814Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.846Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.880Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.904Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.940Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:38.985Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.020Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.054Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.124Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.154Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.180Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.204Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.237Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.270Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.294Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.328Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.350Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.375Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.432Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.464Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.508Z: Fusing consumer Get values only/Values/Map into Collect read time
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.541Z: Fusing consumer Values as string into Get values only/Values/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.565Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.600Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.634Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.670Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.696Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.723Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.746Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.774Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:39.804Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 02, 2020 6:24:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.184Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 02, 2020 6:24:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.265Z: Starting 5 workers in us-central1-f...
    Nov 02, 2020 6:24:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.322Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 02, 2020 6:24:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.327Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 02, 2020 6:24:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.382Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 02, 2020 6:24:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.541Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 6:24:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:40.582Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 02, 2020 6:24:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:24:54.036Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 02, 2020 6:26:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:00.212Z: Workers have started successfully.
    Nov 02, 2020 6:26:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:04.010Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 6:26:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:04.079Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 02, 2020 6:26:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:11.480Z: Workers have started successfully.
    Nov 02, 2020 6:26:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:19.863Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 6:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:27.859Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 6:26:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:36.314Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 02, 2020 6:26:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:36.382Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 02, 2020 6:26:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:36.438Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 02, 2020 6:26:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:36.548Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 6:26:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:46.897Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 6:26:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:47.061Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 02, 2020 6:26:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:47.117Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 02, 2020 6:26:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:47.262Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 6:26:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:50.069Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 6:26:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:50.141Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 02, 2020 6:26:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:50.196Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 02, 2020 6:26:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:50.268Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 02, 2020 6:26:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:54.037Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 02, 2020 6:26:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:54.181Z: Cleaning up.
    Nov 02, 2020 6:26:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:26:54.255Z: Stopping worker pool...
    Nov 02, 2020 6:28:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:28:44.658Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 02, 2020 6:28:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:28:44.708Z: Worker pool stopped.
    Nov 02, 2020 6:28:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_22_24_22-2064084361431052722 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): 91375563-8b34-4783-a96f-56952c234458 and timestamp: 2020-11-02T06:28:50.513000000Z:
                     Metric:                    Value:
                   read_time                    14.258

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)
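
    The NoSuchMethodError above points at StringUtils.isNoneBlank(CharSequence...), which only exists in commons-lang3 3.2 and later; a pre-3.2 commons-lang3 jar ending up earlier on the integration-test classpath would fail exactly like this the first time InfluxDBPublisher runs its pre-publish check. A minimal probe of the missing call, assuming nothing about the actual publisher code beyond the method it resolves (the class name and argument values below are only illustrative):

        import org.apache.commons.lang3.StringUtils;

        public class IsNoneBlankProbe {
            public static void main(String[] args) {
                // isNoneBlank(CharSequence...) was added in commons-lang3 3.2; with an older
                // lang3 jar on the classpath the JVM throws NoSuchMethodError at this call site.
                System.out.println(StringUtils.isNoneBlank("influxdb-host", "beam_test"));  // true
                System.out.println(StringUtils.isNoneBlank("influxdb-host", ""));           // false: "" is blank
            }
        }

    Running such a probe against the same classpath the test uses would confirm whether the commons-lang3 version, rather than the test logic, is what needs fixing.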

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 8 mins 12.152 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 59s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/fgtzbi5whfvms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2234

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2234/display/redirect>

Changes:


------------------------------------------
[...truncated 306.94 KB...]
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 02, 2020 12:24:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213492 bytes, hash dee31080f1322a6c1e8a40ccfd36382df21c84bea1583bd0d5ec32b2efa9664e> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-3uMQgPEyKmweikDM_TY4LfIchL6hWDvQ1ewysu-pZk4.pb
    Nov 02, 2020 12:24:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 02, 2020 12:24:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_16_24_49-14442568941178007544?project=apache-beam-testing
    Nov 02, 2020 12:24:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_16_24_49-14442568941178007544
    Nov 02, 2020 12:24:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_16_24_49-14442568941178007544
    Nov 02, 2020 12:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-02T00:24:53.414Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-110-mge1. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:09.085Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:09.851Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:09.957Z: Expanding GroupByKey operations into optimizable parts.
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:09.984Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.131Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.165Z: Elided trivial flatten 
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.198Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.233Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.277Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.300Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.331Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.356Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.388Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.422Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.444Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.470Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.501Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.525Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.574Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.599Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.624Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.645Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.667Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.701Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.725Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.747Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.769Z: Fusing consumer Get values only/Values/Map into Collect read time
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.805Z: Fusing consumer Values as string into Get values only/Values/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.836Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.859Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.892Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.924Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.958Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Nov 02, 2020 12:25:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:10.993Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.024Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.058Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.091Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.580Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.614Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.657Z: Starting 5 workers in us-central1-f...
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.712Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.727Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.842Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:11.881Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 02, 2020 12:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:12.961Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 02, 2020 12:25:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:35.378Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 12:25:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:35.408Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 02, 2020 12:25:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:40.658Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 12:25:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:54.779Z: Workers have started successfully.
    Nov 02, 2020 12:25:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:25:54.812Z: Workers have started successfully.
    Nov 02, 2020 12:26:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:17.386Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 12:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:27.886Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 02, 2020 12:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:27.965Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 02, 2020 12:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:28.031Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 02, 2020 12:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:28.087Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 12:26:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:36.553Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 12:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:36.715Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 02, 2020 12:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:36.765Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 02, 2020 12:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:36.897Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 12:26:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:39.759Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 02, 2020 12:26:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:39.880Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 02, 2020 12:26:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:39.933Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 02, 2020 12:26:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:39.993Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 02, 2020 12:26:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:45.593Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 02, 2020 12:26:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:45.743Z: Cleaning up.
    Nov 02, 2020 12:26:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:26:45.830Z: Stopping worker pool...
    Nov 02, 2020 12:27:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:27:29.781Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 02, 2020 12:27:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:27:29.826Z: Worker pool stopped.
    Nov 02, 2020 12:27:36 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_16_24_49-14442568941178007544 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): d12f6433-4d49-479a-837e-4cc9c87b44df and timestamp: 2020-11-02T00:27:36.229000000Z:
                     Metric:                    Value:
                   read_time                     9.186

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 6 mins 26.468 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 12s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/ydgmy7jhx2u54

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2233

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2233/display/redirect>

Changes:


------------------------------------------
[...truncated 305.69 KB...]
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource) as step s1
    Nov 01, 2020 6:24:23 PM org.apache.hadoop.util.NativeCodeLoader <clinit>
    WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 01, 2020 6:24:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213487 bytes, hash ae10ad0be9b8d39031f1ea1749ffdd721fc6944b75859e8d36f15a38d4df5036> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-rhCtC-m405Ax8eoXSf_dch_GlEt1hZ6NNvFaONTfUDY.pb
    Nov 01, 2020 6:24:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 6:24:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_10_24_24-11588370775925669346?project=apache-beam-testing
    Nov 01, 2020 6:24:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_10_24_24-11588370775925669346
    Nov 01, 2020 6:24:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_10_24_24-11588370775925669346
    Nov 01, 2020 6:24:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T18:24:27.935Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-110-hd1s. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:42.720Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.362Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.486Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.547Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.701Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.730Z: Elided trivial flatten 
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.761Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.802Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.834Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.863Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.896Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.934Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.956Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:43.987Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.021Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.056Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.093Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.129Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.162Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.199Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.231Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.259Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.291Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.331Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.378Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.426Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.460Z: Fusing consumer Get values only/Values/Map into Collect read time
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.494Z: Fusing consumer Values as string into Get values only/Values/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.529Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.565Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Nov 01, 2020 6:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.598Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.622Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.647Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.678Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.712Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.749Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:44.780Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.176Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.209Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.254Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.322Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.341Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.505Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:45.550Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 6:24:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:24:47.073Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 6:25:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:25:14.789Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 6:25:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:25:36.055Z: Workers have started successfully.
    Nov 01, 2020 6:25:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:25:36.110Z: Workers have started successfully.
    Nov 01, 2020 6:25:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:25:54.376Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:08.023Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 6:26:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:08.102Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 6:26:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:08.169Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 6:26:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:08.253Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:19.021Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:19.187Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 6:26:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:19.236Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 6:26:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:19.410Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:22.297Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:22.366Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 6:26:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:22.434Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 6:26:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:22.503Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 6:26:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:27.819Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 6:26:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:28.008Z: Cleaning up.
    Nov 01, 2020 6:26:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:26:28.092Z: Stopping worker pool...
    Nov 01, 2020 6:27:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:27:20.752Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 6:27:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:27:20.808Z: Worker pool stopped.
    Nov 01, 2020 6:27:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_10_24_24-11588370775925669346 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): 5acd099b-f26a-474c-9e02-6c1a0aed75ec and timestamp: 2020-11-01T18:27:27.173000000Z:
                     Metric:                    Value:
                   read_time                      8.71

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)
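
The NoSuchMethodError above is a classpath conflict rather than a pipeline problem: StringUtils.isNoneBlank(CharSequence...) only exists in commons-lang3 3.2 and later, so the integration test most likely resolved an older commons-lang3 jar (for example one pulled in transitively by the Hadoop dependencies) ahead of the version the Beam test utilities were compiled against. Below is a minimal Java sketch of the failing pattern, assuming only that InfluxDBPublisher guards its connection settings with isNoneBlank as the stack trace indicates; the class and setting names in the sketch are hypothetical.

    import org.apache.commons.lang3.StringUtils;

    public class IsNoneBlankClasspathCheck {
        public static void main(String[] args) {
            // Hypothetical InfluxDB settings, standing in for whatever values
            // the publisher validates before writing metrics.
            String influxHost = "http://localhost:8086";
            String influxMeasurement = "read_time";

            // isNoneBlank(CharSequence...) was added in commons-lang3 3.2.
            // Compiling against 3.2+ but running with an older commons-lang3
            // first on the classpath fails at this call with the same
            // java.lang.NoSuchMethodError reported by the test.
            if (StringUtils.isNoneBlank(influxHost, influxMeasurement)) {
                System.out.println("Settings present; metrics would be published.");
            } else {
                System.out.println("Settings missing; publication would be skipped.");
            }
        }
    }

Because the error is thrown only in the metrics-publication step, the Dataflow job itself still finishes with status DONE while the Gradle task fails, which matches the log above.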

1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 6 mins 46.641 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 36s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/s2pyngxkht7c2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2232

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2232/display/redirect>

Changes:


------------------------------------------
[...truncated 304.89 KB...]
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213484 bytes, hash b9d401e91bb6df5e39d614c064b03a6e31df52d5ada54f1da63a6b387578678b> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-udQB6Ru231451hTAZLA6bjHfUtWtpU8dpjprOHV4Z4s.pb
    Nov 01, 2020 12:23:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 12:24:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_04_23_59-14501971798652188897?project=apache-beam-testing
    Nov 01, 2020 12:24:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_04_23_59-14501971798652188897
    Nov 01, 2020 12:24:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_04_23_59-14501971798652188897
    Nov 01, 2020 12:24:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T12:24:04.390Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-110-dsez. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 01, 2020 12:24:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:18.383Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 12:24:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.141Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 12:24:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.242Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 12:24:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.288Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.441Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.469Z: Elided trivial flatten 
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.504Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.524Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.552Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.590Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.615Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.639Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.683Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.717Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.752Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.787Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.812Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.843Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.878Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.913Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.948Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:19.983Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.015Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.047Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.073Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.109Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.140Z: Fusing consumer Get values only/Values/Map into Collect read time
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.169Z: Fusing consumer Values as string into Get values only/Values/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.203Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.228Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.275Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.306Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.330Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.364Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.399Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.418Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.450Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.850Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.883Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 12:24:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:20.941Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 12:24:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:21.000Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 12:24:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:21.015Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 12:24:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:21.159Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:24:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:21.194Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 12:24:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:43.986Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 12:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:48.243Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 12:24:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:48.268Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 01, 2020 12:24:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:24:53.533Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 12:25:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:07.645Z: Workers have started successfully.
    Nov 01, 2020 12:25:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:07.673Z: Workers have started successfully.
    Nov 01, 2020 12:25:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:27.283Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:39.829Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 12:25:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:39.918Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 12:25:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:39.986Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 12:25:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:40.066Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:48.474Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:48.630Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:48.688Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:48.836Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:50.845Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:50.912Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 12:25:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:50.980Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 12:25:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:51.055Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 12:25:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:55.583Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 12:25:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:55.730Z: Cleaning up.
    Nov 01, 2020 12:25:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:25:55.814Z: Stopping worker pool...
    Nov 01, 2020 12:26:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:26:49.316Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 12:26:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:26:49.387Z: Worker pool stopped.
    Nov 01, 2020 12:26:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_04_23_59-14501971798652188897 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): 64e97438-accc-4ce4-b157-c6331505708a and timestamp: 2020-11-01T12:26:55.642000000Z:
                     Metric:                    Value:
                   read_time                     8.151

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)

1 test completed, 1 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.043 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 6 mins 25.274 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 30s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/wupakcn3l4ng6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2231

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2231/display/redirect>

Changes:


------------------------------------------
[...truncated 307.18 KB...]
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource) as step s1
    Nov 01, 2020 6:24:33 AM org.apache.hadoop.util.NativeCodeLoader <clinit>
    WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213490 bytes, hash b3fefdefb4abf99d0298e9316c91facc16e41c22dc5d292e632bc353001f7316> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-s_7977Sr-Z0CmOkxbJH6zBbkHCLcXSkuYyvDUwAfcxY.pb
    Nov 01, 2020 6:24:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 6:24:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_23_24_34-651411586132320867?project=apache-beam-testing
    Nov 01, 2020 6:24:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_23_24_34-651411586132320867
    Nov 01, 2020 6:24:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_23_24_34-651411586132320867
    Nov 01, 2020 6:24:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T06:24:38.552Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-110-9hvf. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 01, 2020 6:24:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:51.834Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.457Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.573Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.606Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.756Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.781Z: Elided trivial flatten 
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.818Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.842Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.871Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.906Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.929Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:52.975Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.011Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.048Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.086Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.112Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.144Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.182Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.217Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.258Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.296Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.331Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.360Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.394Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.429Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.474Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.499Z: Fusing consumer Get values only/Values/Map into Collect read time
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.540Z: Fusing consumer Values as string into Get values only/Values/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.578Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.618Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.655Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.694Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.740Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.777Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.817Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.851Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:53.880Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.412Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.454Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.497Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.561Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.579Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.728Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:24:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:24:54.789Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 6:25:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:25:10.358Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
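
    The warning above is benign for this run (user metrics remain queryable through dataflow.googleapis.com/job/user_counter), but the cleanup it suggests can be done through the Cloud Monitoring v3 REST endpoints it links to. What follows is a minimal sketch under stated assumptions, not part of the Beam test: Java 11+, an OAuth access token exported as ACCESS_TOKEN (for example from gcloud auth print-access-token), and a placeholder descriptor type standing in for whichever custom.googleapis.com/* descriptors are actually unused.

    import java.net.URI;
    import java.net.URLEncoder;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.charset.StandardCharsets;

    public class MetricDescriptorCleanup {
        public static void main(String[] args) throws Exception {
            String project = "apache-beam-testing";        // project named in this log
            String token = System.getenv("ACCESS_TOKEN");  // assumed to be set beforehand
            HttpClient client = HttpClient.newHttpClient();

            // projects.metricDescriptors.list, filtered to custom metrics only.
            String filter = URLEncoder.encode(
                    "metric.type = starts_with(\"custom.googleapis.com/\")",
                    StandardCharsets.UTF_8);
            HttpRequest list = HttpRequest.newBuilder(URI.create(
                    "https://monitoring.googleapis.com/v3/projects/" + project
                            + "/metricDescriptors?filter=" + filter))
                    .header("Authorization", "Bearer " + token)
                    .GET().build();
            System.out.println(client.send(list, HttpResponse.BodyHandlers.ofString()).body());

            // projects.metricDescriptors.delete for one unused descriptor
            // (hypothetical type; replace with a real one taken from the listing).
            String unusedType = "custom.googleapis.com/some_old_counter";
            HttpRequest delete = HttpRequest.newBuilder(URI.create(
                    "https://monitoring.googleapis.com/v3/projects/" + project
                            + "/metricDescriptors/" + unusedType))
                    .header("Authorization", "Bearer " + token)
                    .DELETE().build();
            System.out.println(client.send(delete, HttpResponse.BodyHandlers.ofString()).statusCode());
        }
    }
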
    Nov 01, 2020 6:25:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:25:24.758Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 6:25:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:25:49.839Z: Workers have started successfully.
    Nov 01, 2020 6:25:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:25:49.862Z: Workers have started successfully.
    Nov 01, 2020 6:26:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:11.712Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:22.525Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 6:26:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:22.600Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 6:26:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:22.655Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 6:26:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:22.748Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:33.164Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:33.337Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 6:26:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:33.394Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 6:26:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:33.543Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:36.392Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 6:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:36.469Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 6:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:36.528Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 6:26:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:36.599Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 6:26:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:40.354Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 6:26:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:40.519Z: Cleaning up.
    Nov 01, 2020 6:26:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:26:40.606Z: Stopping worker pool...
    Nov 01, 2020 6:27:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:27:39.537Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 6:27:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:27:39.604Z: Worker pool stopped.
    Nov 01, 2020 6:27:49 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_23_24_34-651411586132320867 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): 3a5348a4-2ae6-4a79-be1e-ef89bc902a0a and timestamp: 2020-11-01T06:27:49.517000000Z:
                     Metric:                    Value:
                   read_time                     7.723

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)
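
The Dataflow job itself finished with status DONE; the test fails afterwards while publishing the collected metrics to InfluxDB. StringUtils.isNoneBlank(CharSequence...) only exists in commons-lang3 3.2 and later, so a NoSuchMethodError with this descriptor usually means an older commons-lang3 jar earlier on the integration-test runtime classpath (for example one pulled in transitively by the Hadoop or database dependencies) is shadowing the version the publisher was compiled against. Below is a minimal sketch of the failing call outside any Beam code, under that assumption; the class name and argument values are illustrative only.

    // Probe for the commons-lang3 version conflict suspected above.
    // Compiles against commons-lang3 3.2+; if an older commons-lang3 jar wins at
    // runtime, the isNoneBlank call throws java.lang.NoSuchMethodError with the
    // same descriptor as in the stack trace.
    import org.apache.commons.lang3.StringUtils;

    public class IsNoneBlankProbe {
        public static void main(String[] args) {
            // isNoneBlank(CharSequence...) returns true only if none of the
            // arguments are null, empty, or whitespace-only.
            System.out.println(StringUtils.isNoneBlank("influx-host", "influx-db"));
        }
    }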

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 6 mins 39.659 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 25s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/qjzhgunl4w3pi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2230

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2230/display/redirect>

Changes:


------------------------------------------
[...truncated 303.81 KB...]
    Nov 01, 2020 12:24:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource) as step s1
    Nov 01, 2020 12:24:12 AM org.apache.hadoop.util.NativeCodeLoader <clinit>
    WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213488 bytes, hash f265c3ab8a9901c19787ab63d913ec72a8d84c95075f2d35f617694414d853f2> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-8mXDq4qZAcGXh6tj2RPscqjYTJUHXy019hdpRBTYU_I.pb
    Nov 01, 2020 12:24:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 12:24:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_17_24_12-5797284074638853122?project=apache-beam-testing
    Nov 01, 2020 12:24:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_17_24_12-5797284074638853122
    Nov 01, 2020 12:24:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_17_24_12-5797284074638853122
    Nov 01, 2020 12:24:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T00:24:16.684Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-110-lya9. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 01, 2020 12:24:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:31.700Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 12:24:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.408Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 12:24:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.524Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 12:24:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.557Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 12:24:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.704Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.742Z: Elided trivial flatten 
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.777Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.812Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.847Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.882Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.916Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.951Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:32.979Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.033Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.068Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.101Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.138Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.170Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.207Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.250Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.295Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.326Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.359Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.386Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.415Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.444Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.492Z: Fusing consumer Get values only/Values/Map into Collect read time
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.525Z: Fusing consumer Values as string into Get values only/Values/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.559Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.596Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.628Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.654Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.678Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.687Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.707Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.741Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.767Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:33.799Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.180Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.249Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.292Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.314Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.373Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.485Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:24:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:34.526Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 12:24:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:24:58.771Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 12:25:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:18.324Z: Workers have started successfully.
    Nov 01, 2020 12:25:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:18.352Z: Workers have started successfully.
    Nov 01, 2020 12:25:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:38.329Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:49.996Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Nov 01, 2020 12:25:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:50.075Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 12:25:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:50.127Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Nov 01, 2020 12:25:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:50.206Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:58.386Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:25:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:58.511Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 12:25:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:58.569Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Nov 01, 2020 12:25:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:25:58.698Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:26:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:01.594Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Nov 01, 2020 12:26:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:01.671Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 12:26:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:01.722Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Nov 01, 2020 12:26:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:01.802Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 12:26:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:06.633Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Nov 01, 2020 12:26:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:06.806Z: Cleaning up.
    Nov 01, 2020 12:26:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:06.893Z: Stopping worker pool...
    Nov 01, 2020 12:26:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:57.047Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 12:26:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:26:57.087Z: Worker pool stopped.
    Nov 01, 2020 12:27:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_17_24_12-5797284074638853122 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): cb051008-8129-4a27-9dd0-43f4337f080d and timestamp: 2020-11-01T00:27:02.477000000Z:
                     Metric:                    Value:
                   read_time                     7.682

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 6 mins 49.26 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/cl2xdddnux5wo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2229

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2229/display/redirect>

Changes:


------------------------------------------
[...truncated 308.03 KB...]
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Oct 31, 2020 6:24:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213489 bytes, hash 66d4b9b8ff9723191e6e938ca6224cc21d0e8a3ac0c404f008c50a51e49f8b3c> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-ZtS5uP-XIxkebpOMpiJMwh0OijrAxATwCMUKUeSfizw.pb
    Oct 31, 2020 6:25:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 6:25:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_11_25_00-6379154928927139346?project=apache-beam-testing
    Oct 31, 2020 6:25:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_11_25_00-6379154928927139346
    Oct 31, 2020 6:25:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_11_25_00-6379154928927139346
    Oct 31, 2020 6:25:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T18:25:04.513Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-103-yzs4. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Oct 31, 2020 6:25:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:16.938Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.481Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.706Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.737Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.898Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.937Z: Elided trivial flatten 
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.974Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:17.998Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.032Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.101Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.131Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.166Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.201Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.238Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.273Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.309Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.346Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.379Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.408Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.439Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.474Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.498Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.536Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.570Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.604Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.671Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.699Z: Fusing consumer Get values only/Values/Map into Collect read time
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.723Z: Fusing consumer Values as string into Get values only/Values/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.754Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.788Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.820Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.847Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.880Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.906Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.938Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:18.974Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.010Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.389Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.424Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.472Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.530Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.555Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.690Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:25:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:19.730Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Oct 31, 2020 6:25:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:34.079Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
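
    Side note on the warning above: it is informational only, but since it suggests deleting old or unused custom metric descriptors, a possible clean-up with the Cloud Monitoring v3 Java client is sketched below. The project ID is taken from this log; the filter and the idea of deleting anything at all are assumptions, so the delete call is left commented out.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    // Sketch only: lists the custom.googleapis.com/* descriptors the warning refers to.
    public class ListCustomMetricDescriptors {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of("apache-beam-testing").toString())
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            System.out.println(descriptor.getName());
            // client.deleteMetricDescriptor(descriptor.getName()); // only after verifying it is unused
          }
        }
      }
    }
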
    Oct 31, 2020 6:25:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:43.703Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:25:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:43.731Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 31, 2020 6:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:25:49.022Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:26:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:05.654Z: Workers have started successfully.
    Oct 31, 2020 6:26:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:05.695Z: Workers have started successfully.
    Oct 31, 2020 6:26:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:29.402Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:40.092Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Oct 31, 2020 6:26:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:40.166Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Oct 31, 2020 6:26:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:40.224Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Oct 31, 2020 6:26:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:40.461Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:48.794Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:48.939Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Oct 31, 2020 6:26:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:49.001Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Oct 31, 2020 6:26:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:49.159Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:52.134Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 6:26:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:52.191Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Oct 31, 2020 6:26:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:52.246Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Oct 31, 2020 6:26:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:52.317Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Oct 31, 2020 6:27:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:57.984Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Oct 31, 2020 6:27:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:58.142Z: Cleaning up.
    Oct 31, 2020 6:27:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:26:58.225Z: Stopping worker pool...
    Oct 31, 2020 6:27:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:27:48.710Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 6:27:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:27:48.767Z: Worker pool stopped.
    Oct 31, 2020 6:27:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_11_25_00-6379154928927139346 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): 42f220bc-9e03-45c1-9040-8f47ee1b5512 and timestamp: 2020-10-31T18:27:54.862000000Z:
                     Metric:                    Value:
                   read_time                    10.817

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)
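
The NoSuchMethodError above is the root cause of this failure (and of the #2228 run below): StringUtils.isNoneBlank(CharSequence...) was only added in commons-lang3 3.2, so hitting this error at runtime strongly suggests that an older commons-lang3 jar, presumably pulled onto the integration-test classpath by the Hadoop-related dependencies, shadows the version that InfluxDBPublisher was compiled against. As a hedged illustration only, the sketch below reproduces the call shape that fails; the class name and the InfluxDB settings are made up for the example and are not taken from the Beam sources.

    import org.apache.commons.lang3.StringUtils;

    // Hypothetical reproduction, not the Beam code itself. isNoneBlank(CharSequence...)
    // was added in commons-lang3 3.2; running code compiled against 3.2+ on a classpath
    // that resolves an older commons-lang3 fails with the same java.lang.NoSuchMethodError
    // reported by the test above.
    public class IsNoneBlankRepro {
      public static void main(String[] args) {
        // Illustrative InfluxDB settings; the names and values are assumptions.
        String host = "http://localhost:8086";
        String database = "beam_test_metrics";
        String measurement = "hadoopformatioit_results";

        // A guard of the same shape as the publishWithCheck call in the trace:
        // publish only when none of the required settings are blank.
        if (StringUtils.isNoneBlank(host, database, measurement)) {
          System.out.println("Settings present, metrics would be published to InfluxDB.");
        } else {
          System.out.println("Missing settings, skipping metrics publication.");
        }
      }
    }

With commons-lang3 3.2 or newer first on the classpath this prints the "Settings present" branch; with a pre-3.2 jar it fails with exactly the NoSuchMethodError seen here, so pinning commons-lang3 for the integrationTest runtime would be the usual remedy (an assumption about the dependency graph, not something visible in this log).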

1 test completed, 1 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 6 mins 31.885 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 20s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/nlhxik2y4nebs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #2228

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2228/display/redirect>

Changes:


------------------------------------------
[...truncated 304.69 KB...]
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Collect read time as step s2
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Get values only/Values/Map as step s3
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Values as string as step s4
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/WithKeys/AddKeys/Map as step s5
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/GroupByKey as step s6
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues as step s7
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Values/Values/Map as step s8
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as step s9
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/View.AsIterable/CreateDataflowView as step s10
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/CreateVoid/Read(CreateSource) as step s11
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/ProduceDefault as step s12
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Calculate hashcode/Flatten.PCollections as step s13
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s14
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s15
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s16
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s17
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s18
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s19
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s20
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s21
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s22
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s23
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s24
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s25
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Oct 31, 2020 12:23:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <213483 bytes, hash 933da00eac8d4cc826c217dbf4cb28c594b3d4e130514ead19cbc1381dc72d44> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-kz2gDqyNTMgmwhfb9MsoxZSz1OEwUU6tGcvBOB3HLUQ.pb
    Oct 31, 2020 12:23:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 12:23:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_05_23_57-4510067915463930254?project=apache-beam-testing
    Oct 31, 2020 12:23:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_05_23_57-4510067915463930254
    Oct 31, 2020 12:23:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_05_23_57-4510067915463930254
    Oct 31, 2020 12:24:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T12:24:01.627Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: hadoopformatioit0writeandreadusinghadoopformat-jenkins-103-zmiy. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Oct 31, 2020 12:24:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:15.357Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:17.542Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.091Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.204Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.237Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.378Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.408Z: Elided trivial flatten 
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.440Z: Unzipping flatten s17 for input s15.org.apache.beam.sdk.values.PCollection.<init>:400#d928e938ba0454ea
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.469Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.513Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:400#805b93b0a398202d
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.566Z: Fusing unzipped copy of PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous), through flatten Calculate hashcode/Flatten.PCollections, into producer Calculate hashcode/ProduceDefault
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.599Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.624Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.651Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.684Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.709Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.739Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.762Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.794Z: Unzipping flatten s17-u31 for input s19.org.apache.beam.sdk.values.PCollection.<init>:400#f0f805bd79288b-c29
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.820Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.844Z: Unzipping flatten s13-u36 for input s14.org.apache.beam.sdk.values.PCollection.<init>:400#4f24fdb6be16210b-c34
    Oct 31, 2020 12:24:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.881Z: Fusing unzipped copy of PAssert$0/GroupGlobally/ParDo(ToSingletonIterables), through flatten Calculate hashcode/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.912Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.945Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.967Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:18.999Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.033Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.061Z: Fusing consumer Get values only/Values/Map into Collect read time
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.084Z: Fusing consumer Values as string into Get values only/Values/Map
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.110Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.159Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.193Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.221Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.245Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.280Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.315Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.361Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.389Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.717Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Create
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.754Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.800Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.860Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Create
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.877Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:19.994Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 12:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:20.037Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Oct 31, 2020 12:24:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:45.352Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 12:24:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:45.378Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 31, 2020 12:24:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:24:50.681Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 12:25:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:07.301Z: Workers have started successfully.
    Oct 31, 2020 12:25:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:07.324Z: Workers have started successfully.
    Oct 31, 2020 12:25:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:27.409Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 12:25:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:37.745Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Oct 31, 2020 12:25:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:37.803Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Oct 31, 2020 12:25:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:37.856Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Oct 31, 2020 12:25:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:37.928Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 12:25:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:46.242Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 12:25:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:46.375Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Oct 31, 2020 12:25:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:46.426Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Oct 31, 2020 12:25:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:46.589Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:48.347Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)+PAssert$0/GroupGlobally/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GroupByKey/Reify+PAssert$0/GroupGlobally/GroupByKey/Write
    Oct 31, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:48.429Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Close
    Oct 31, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:48.482Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Close
    Oct 31, 2020 12:25:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:48.581Z: Executing operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Oct 31, 2020 12:25:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:53.182Z: Finished operation PAssert$0/GroupGlobally/GroupByKey/Read+PAssert$0/GroupGlobally/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Oct 31, 2020 12:25:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:53.318Z: Cleaning up.
    Oct 31, 2020 12:25:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:25:53.393Z: Stopping worker pool...
    Oct 31, 2020 12:26:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:26:37.253Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 12:26:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:26:37.298Z: Worker pool stopped.
    Oct 31, 2020 12:26:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_05_23_57-4510067915463930254 finished with status DONE.

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_OUT
    Load test results for test (ID): 8f9d09a9-abd7-4925-8d44-d3315fb198f8 and timestamp: 2020-10-31T12:26:45.155000000Z:
                     Metric:                    Value:
                   read_time                     6.886

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat FAILED
    java.lang.NoSuchMethodError: org.apache.commons.lang3.StringUtils.isNoneBlank([Ljava/lang/CharSequence;)Z
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithCheck(InfluxDBPublisher.java:71)
        at org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher.publishWithSettings(InfluxDBPublisher.java:65)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:87)
        at org.apache.beam.sdk.testutils.metrics.IOITMetrics.publishToInflux(IOITMetrics.java:77)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.collectAndPublishMetrics(HadoopFormatIOIT.java:226)
        at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:210)

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 6 mins 34.831 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 23s
95 actionable tasks: 57 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/i2hkhj7u4keie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org