Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/10/01 14:45:16 UTC

Build failed in Jenkins: beam_PerformanceTests_Cdap #234

See <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/234/display/redirect?page=changes>

Changes:

[noreply] JdbcIO fetchSize can be set to Integer.MIN_VALUE (#23444)
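Context for the change above: JdbcIO's read transform exposes withFetchSize(int), and PR #23444 lifts the earlier requirement that the value be positive, so Integer.MIN_VALUE can now be passed through to the JDBC driver. The MySQL driver treats Integer.MIN_VALUE as a request to stream the result set row by row instead of buffering it. A minimal sketch of the usage; the driver class, URL, and query below are illustrative assumptions, not taken from the test:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;

    Pipeline p = Pipeline.create();
    p.apply("Read rows", JdbcIO.<String>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
            "com.mysql.cj.jdbc.Driver",               // assumed driver
            "jdbc:mysql://localhost:3306/testdb"))    // assumed URL
        .withQuery("SELECT name FROM test_table")     // assumed query
        // Integer.MIN_VALUE asks the MySQL driver to stream rows one at a
        // time rather than load the whole result set; rejected before #23444.
        .withFetchSize(Integer.MIN_VALUE)
        .withRowMapper(rs -> rs.getString(1))
        .withCoder(StringUtf8Coder.of()));
    p.run().waitUntilFinish();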


------------------------------------------
[...truncated 2.38 MB...]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Prevent fusion before writing/Reshuffle, transform=ReshuffleOverrideFactory.ReshuffleWithOnlyTrigger}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Prevent fusion before writing/Reshuffle/Window.Into(), transform=Window.Into()}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign, transform=Window.Assign}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign.out [PCollection@1408538096]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Prevent fusion before writing/Reshuffle/GroupByKey, transform=GroupByKey}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent fusion before writing/Reshuffle/GroupByKey.out [PCollection@1165000566]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Prevent fusion before writing/Reshuffle/ExpandIterable, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent fusion before writing/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output [PCollection@1202547191]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Prevent fusion before writing/Values, transform=Values}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Prevent fusion before writing/Values/Values, transform=MapElements}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Prevent fusion before writing/Values/Values/Map, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent fusion before writing/Values/Values/Map/ParMultiDo(Anonymous).output [PCollection@881977454]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Collect write time, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Collect write time/ParMultiDo(TimeMonitor).output [PCollection@2113891589]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Construct rows for DBOutputFormat, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Construct rows for DBOutputFormat/ParMultiDo(ConstructDBOutputFormatRow).output [PCollection@1568159144]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO, transform=CdapIO.Write}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write, transform=HadoopFormatIO.Write}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig, transform=Create.Values}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource), transform=Read(CreateSource)}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper).output [PCollection@1500079441]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton, transform=View.AsSingleton}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView, transform=BatchViewAsSingleton}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton), transform=Combine.globally(Singleton)}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys, transform=WithKeys}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys, transform=MapElements}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous).output [PCollection@828070163]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton), transform=Combine.perKey(Singleton)}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey, transform=GroupByKey}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey.out [PCollection@1124047479]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues, transform=DataflowRunner.CombineGroupedValues}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output [PCollection@152426436]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values, transform=Values}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values, transform=MapElements}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous).output [PCollection@599590015]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey, transform=BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey), transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)/ParMultiDo(UseWindowHashAsKeyAndWindowAsSortKey).output [PCollection@387366967]
    14:42:11.420 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly, transform=BatchViewOverrides.GroupByKeyAndSortValuesOnly}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly.out [PCollection@1887922615]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow), transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)/ParMultiDo(IsmRecordForSingularValuePerWindow).output [PCollection@549696331]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView, transform=CreateDataflowView}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output [PCollection@1901018532]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob), transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)/ParMultiDo(SetupJob).output [PCollection@925024581]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition, transform=HadoopFormatIO.GroupDataByPartition}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask/ParMultiDo(AssignTask).output [PCollection@2005293363]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId, transform=GroupByKey}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId.out [PCollection@1277882374]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks/ParMultiDo(FlattenGroupedTasks).output [PCollection@325674467]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/Write, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/Write/ParMultiDo(Write).output [PCollection@987255094]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks, transform=Combine.globally(IterableCombiner)}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys, transform=WithKeys}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys, transform=MapElements}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map/ParMultiDo(Anonymous).output [PCollection@1117747481]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner), transform=Combine.perKey(IterableCombiner)}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey, transform=GroupByKey}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey.out [PCollection@2038185019]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues, transform=DataflowRunner.CombineGroupedValues}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output [PCollection@913148823]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values, transform=Values}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values, transform=MapElements}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map/ParMultiDo(Anonymous).output [PCollection@256522893]
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CommitWriteJob, transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:42:11.421 [Test worker] DEBUG org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write using CdapIO/HadoopFormatIO.Write/CommitWriteJob/ParMultiDo(CommitJob).output [PCollection@881513107]
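For orientation, the transform names in the traversal above imply a write pipeline of roughly the following shape. This is a hedged reconstruction, not the test's source: ProduceDbRowsFn and ConstructDBOutputFormatRowFn are hypothetical stand-ins, the row count is assumed, the TimeMonitor constructor arguments are guesses, and the CdapIO.Write configuration is elided:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.Reshuffle;

    Pipeline p = Pipeline.create(options);
    p.apply("Generate sequence", GenerateSequence.from(0).to(1_000_000))
        .apply("Produce db rows", ParDo.of(new ProduceDbRowsFn()))
        // Reshuffle.viaRandomKey() expands to the "Pair with random key",
        // Window.Assign, GroupByKey and ExpandIterable nodes logged above; its
        // purpose is to break producer/consumer fusion before the write stage.
        .apply("Prevent fusion before writing", Reshuffle.viaRandomKey())
        .apply("Collect write time", ParDo.of(new TimeMonitor<>("cdapioit", "write_time")))
        .apply("Construct rows for DBOutputFormat",
            ParDo.of(new ConstructDBOutputFormatRowFn()))
        .apply("Write using CdapIO", cdapWrite);  // a CdapIO.Write configured elsewhere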
    14:42:13.337 [Test worker] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:12.916Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: cdapioit0testcdapioreadsandwritescorrectlyinbatch-jenkins--c87r. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
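The label warning above is benign: Dataflow derives GCE labels from the job name, and the Jenkins-generated name only fits the label rules after modification. Where clean labels matter, the job name can be set explicitly before run(); a minimal sketch with an assumed, label-safe name:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.as(DataflowPipelineOptions.class);
    // Valid Cloud Labels are limited to lowercase letters, digits and dashes.
    options.setJobName("cdapioit-write-batch");  // assumed name, not the test's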
    14:42:27.752 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:26.783Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    14:42:33.678 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:30.829Z: Worker configuration: e2-standard-2 in us-central1-a.
    14:42:33.678 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.001Z: Expanding CoGroupByKey operations into optimizable parts.
    14:42:33.679 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.064Z: Combiner lifting skipped for step Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId: GroupByKey not followed by a combiner.
    14:42:33.679 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.093Z: Combiner lifting skipped for step Prevent fusion before writing/Reshuffle/GroupByKey: GroupByKey not followed by a combiner.
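The two "Combiner lifting skipped" lines above are expected: lifting applies only when a GroupByKey feeds directly into a Combine, which lets the runner pre-combine values on the producing workers before the shuffle. Both GroupByKeys here come from Reshuffle and from partition grouping, so there is nothing to lift. A minimal sketch of the contrast, assuming an input of type PCollection<KV<String, Long>>:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // GroupByKey followed by a CombineFn: eligible for combiner lifting,
    // so partial sums are computed before the shuffle.
    PCollection<KV<String, Long>> sums =
        input.apply(Combine.perKey(Sum.ofLongs()));

    // Bare GroupByKey (as inside Reshuffle): no combiner follows, so the
    // optimizer logs "Combiner lifting skipped" and shuffles raw values.
    PCollection<KV<String, Iterable<Long>>> grouped =
        input.apply(GroupByKey.<String, Long>create());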
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.184Z: Expanding GroupByKey operations into optimizable parts.
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.213Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    14:42:33.679 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.342Z: Annotating graph with Autotuner information.
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.384Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.421Z: Fusing consumer Produce db rows into Generate sequence/Read(BoundedCountingSource)
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.455Z: Fusing consumer Prevent fusion before writing/Pair with random key into Produce db rows
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.491Z: Fusing consumer Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign into Prevent fusion before writing/Pair with random key
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.525Z: Fusing consumer Prevent fusion before writing/Reshuffle/GroupByKey/Reify into Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.559Z: Fusing consumer Prevent fusion before writing/Reshuffle/GroupByKey/Write into Prevent fusion before writing/Reshuffle/GroupByKey/Reify
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.586Z: Fusing consumer Prevent fusion before writing/Reshuffle/GroupByKey/GroupByWindow into Prevent fusion before writing/Reshuffle/GroupByKey/Read
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.619Z: Fusing consumer Prevent fusion before writing/Reshuffle/ExpandIterable into Prevent fusion before writing/Reshuffle/GroupByKey/GroupByWindow
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.654Z: Fusing consumer Prevent fusion before writing/Values/Values/Map into Prevent fusion before writing/Reshuffle/ExpandIterable
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.686Z: Fusing consumer Collect write time into Prevent fusion before writing/Values/Values/Map
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.718Z: Fusing consumer Construct rows for DBOutputFormat into Collect write time
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.743Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob) into Construct rows for DBOutputFormat
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.768Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask into Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.801Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify into Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.835Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Write into Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.860Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow into Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Read
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.892Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks into Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.919Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/Write into Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.946Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map into Write using CdapIO/HadoopFormatIO.Write/Write
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.975Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:32.996Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial
    14:42:33.679 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.017Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.047Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.072Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.096Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.131Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/CommitWriteJob into Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.154Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.187Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.210Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.236Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.262Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.282Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.303Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.335Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.368Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    14:42:33.680 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.390Z: Fusing consumer Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    14:42:33.680 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.433Z: Workflow config is missing a default resource spec.
    14:42:33.680 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.453Z: Adding StepResource setup and teardown to workflow graph.
    14:42:33.680 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.487Z: Adding workflow start and stop steps.
    14:42:33.680 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.523Z: Assigning stage ids.
    14:42:35.782 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.680Z: Executing wait step start68
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.761Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.799Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Create
    14:42:35.782 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.800Z: Starting worker pool setup.
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:33.822Z: Starting 5 workers in us-central1-a...
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:34.211Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:34.213Z: Finished operation Prevent fusion before writing/Reshuffle/GroupByKey/Create
    14:42:35.782 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:34.308Z: Value "Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Session" materialized.
    14:42:35.782 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:34.342Z: Value "Prevent fusion before writing/Reshuffle/GroupByKey/Session" materialized.
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:34.386Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    14:42:35.782 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:42:34.435Z: Executing operation Generate sequence/Read(BoundedCountingSource)+Produce db rows+Prevent fusion before writing/Pair with random key+Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign+Prevent fusion before writing/Reshuffle/GroupByKey/Reify+Prevent fusion before writing/Reshuffle/GroupByKey/Write
    14:43:15.986 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:43:15.576Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    14:43:45.343 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:43:45.246Z: Workers have started successfully.
    14:44:15.196 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:14.826Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    14:44:15.196 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:14.895Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    14:44:15.196 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:14.960Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    14:44:15.196 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:15.039Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    14:44:17.281 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:15.205Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    14:44:17.281 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:15.272Z: Value "Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Session" materialized.
    14:44:17.281 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:15.343Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    14:44:19.007 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:18.088Z: Finished operation Generate sequence/Read(BoundedCountingSource)+Produce db rows+Prevent fusion before writing/Pair with random key+Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign+Prevent fusion before writing/Reshuffle/GroupByKey/Reify+Prevent fusion before writing/Reshuffle/GroupByKey/Write
    14:44:19.007 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:18.158Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Close
    14:44:19.007 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:18.211Z: Finished operation Prevent fusion before writing/Reshuffle/GroupByKey/Close
    14:44:19.007 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:18.908Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    14:44:19.007 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:18.973Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    14:44:21.474 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:19.033Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    14:44:21.474 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:19.089Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    14:44:24.527 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.228Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    14:44:24.527 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.300Z: Value "Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow).out0" materialized.
    14:44:24.527 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.370Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    14:44:24.527 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.439Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    14:44:24.527 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.500Z: Value "Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView.out0" materialized.
    14:44:24.527 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.582Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Create
    14:44:24.527 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.734Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Create
    14:44:24.527 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.802Z: Value "Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Session" materialized.
    14:44:24.527 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:22.887Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Read+Prevent fusion before writing/Reshuffle/GroupByKey/GroupByWindow+Prevent fusion before writing/Reshuffle/ExpandIterable+Prevent fusion before writing/Values/Values/Map+Collect write time+Construct rows for DBOutputFormat+Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Write
    14:44:36.102 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:35.578Z: Finished operation Prevent fusion before writing/Reshuffle/GroupByKey/Read+Prevent fusion before writing/Reshuffle/GroupByKey/GroupByWindow+Prevent fusion before writing/Reshuffle/ExpandIterable+Prevent fusion before writing/Values/Values/Map+Collect write time+Construct rows for DBOutputFormat+Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Write
    14:44:36.102 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:35.644Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Close
    14:44:36.102 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:35.695Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Close
    14:44:36.102 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:35.763Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Create
    14:44:36.102 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:35.943Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Create
    14:44:36.102 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:36.008Z: Value "Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Session" materialized.
    14:44:37.384 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:36.087Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Read+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks+Write using CdapIO/HadoopFormatIO.Write/Write+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write
    14:44:48.713 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:47.359Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Read+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow+Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks+Write using CdapIO/HadoopFormatIO.Write/Write+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write
    14:44:48.713 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:47.411Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Close
    14:44:51.359 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:48.841Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Close
    14:44:51.359 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:48.906Z: Executing operation Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map+Write using CdapIO/HadoopFormatIO.Write/CommitWriteJob
    14:44:51.360 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:50.407Z: Finished operation Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract+Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map+Write using CdapIO/HadoopFormatIO.Write/CommitWriteJob
    14:44:51.360 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:50.502Z: Executing success step success66
    14:44:51.360 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:50.583Z: Cleaning up.
    14:44:51.360 [Test worker] DEBUG org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:50.625Z: Starting worker pool teardown.
    14:44:51.360 [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2022-10-01T14:44:50.653Z: Stopping worker pool...
    14:45:13.006 [Thread-7] WARN org.apache.beam.runners.dataflow.DataflowPipelineJob - Job is already running in Google Cloud Platform, Ctrl-C will not cancel it.
    To cancel the job in the cloud, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-10-01_07_42_08-5706053843353646199
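The gcloud command above is the CLI route; the same cancellation is also available in-process through the PipelineResult handle returned by run(). A minimal sketch:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    PipelineResult result = pipeline.run();
    try {
        // Equivalent in effect to `gcloud dataflow jobs cancel <job-id>` when
        // running on the Dataflow runner.
        result.cancel();
    } catch (IOException e) {
        // The job may already have finished or be unreachable.
    }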

org.apache.beam.sdk.io.cdap.CdapIOIT > testCdapIOReadsAndWritesCorrectlyInBatch SKIPPED

> Task :sdks:java:io:cdap:integrationTest FAILED
:sdks:java:io:cdap:integrationTest (Thread[Execution worker Thread 4,5,main]) completed. Took 3 mins 58.818 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:cdap:integrationTest'.
> Process 'Gradle Test Executor 4' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/7.5.1/userguide/java_testing.html#sec:test_execution

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
142 actionable tasks: 84 executed, 56 from cache, 2 up-to-date

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=4ccfc780-bd43-49f8-ad8d-b7a08b352bd0, currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 2321936
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-2321936.out.log
----- Last  20 lines from daemon log file - daemon-2321936.out.log -----
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/7.5.1/userguide/java_testing.html#sec:test_execution

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
142 actionable tasks: 84 executed, 56 from cache, 2 up-to-date

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Cdap #236

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/236/display/redirect>


Build failed in Jenkins: beam_PerformanceTests_Cdap #235

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/235/display/redirect>

Changes:


------------------------------------------
[...truncated 333.10 KB...]
Resolve mutations for :sdks:java:extensions:protobuf:testClasses (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:testClasses (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:testClasses (Thread[Execution **** Thread 7,5,main]) started.

> Task :sdks:java:extensions:protobuf:testClasses
Skipping task ':sdks:java:extensions:protobuf:testClasses' as it has no actions.
:sdks:java:extensions:protobuf:testClasses (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:extensions:protobuf:testJar (Thread[Execution **** Thread 7,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:testJar (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:testJar (Thread[Execution **** Thread 5,5,main]) started.

> Task :sdks:java:io:cdap:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:cdap:compileTestJava'.
Build cache key for task ':sdks:java:io:cdap:compileTestJava' is 1103da5a127e883e2a363bfeb0dd8743
Task ':sdks:java:io:cdap:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:cdap:compileTestJava' with cache key 1103da5a127e883e2a363bfeb0dd8743
:sdks:java:io:cdap:compileTestJava (Thread[Execution **** Thread 4,5,main]) completed. Took 0.243 secs.
Resolve mutations for :sdks:java:io:cdap:testClasses (Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :sdks:java:io:cdap:testClasses (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:cdap:testClasses (Thread[Execution **** Thread 4,5,main]) started.

> Task :sdks:java:io:cdap:testClasses
Skipping task ':sdks:java:io:cdap:testClasses' as it has no actions.
:sdks:java:io:cdap:testClasses (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:extensions:protobuf:testJar
Caching disabled for task ':sdks:java:extensions:protobuf:testJar' because:
  Not worth caching
Task ':sdks:java:extensions:protobuf:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:protobuf:testJar (Thread[Execution **** Thread 5,5,main]) completed. Took 0.067 secs.
work action resolve beam-sdks-java-extensions-protobuf-tests.jar (project :sdks:java:extensions:protobuf) (Thread[Execution **** Thread 5,5,main]) started.
work action null (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution ****,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is 18fb4d7c100224cb06ef587f01ab1760
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache key 18fb4d7c100224cb06ef587f01ab1760
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[included builds,5,main]) completed. Took 0.362 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is cf939a8e73bacd4cee6ced7e1c9ce8c2
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key cf939a8e73bacd4cee6ced7e1c9ce8c2
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 5,5,main]) completed. Took 0.35 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****:legacy-****) (Thread[included builds,5,main]) started.
work action null (Thread[included builds,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****:legacy-****) (Thread[Execution **** Thread 6,5,main]) started.
work action null (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 131209ca67d467a7040c775282cdee1a
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key 131209ca67d467a7040c775282cdee1a
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution ****,5,main]) completed. Took 0.512 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses (Thread[included builds,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses (Thread[included builds,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 5,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[included builds,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Not worth caching
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[included builds,5,main]) completed. Took 0.294 secs.
work action resolve beam-sdks-java-io-google-cloud-platform-tests.jar (project :sdks:java:io:google-cloud-platform) (Thread[Execution ****,5,main]) started.
work action null (Thread[Execution ****,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is f8d50c46088f3d6c04ab90450807bfdb
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key f8d50c46088f3d6c04ab90450807bfdb
:runners:google-cloud-dataflow-java:compileTestJava (Thread[included builds,5,main]) completed. Took 0.288 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>, not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 4,5,main]) completed. Took 0.028 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project :runners:google-cloud-dataflow-java) (Thread[included builds,5,main]) started.
work action null (Thread[included builds,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:cdap:integrationTest (Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :sdks:java:io:cdap:integrationTest (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:cdap:integrationTest (Thread[Execution **** Thread 4,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 6,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 55 started executing tests.

> Task :sdks:java:io:cdap:integrationTest
Custom actions are attached to task ':sdks:java:io:cdap:integrationTest'.
Build cache key for task ':sdks:java:io:cdap:integrationTest' is 169de168775f23ebc8760e04c66e458b
Task ':sdks:java:io:cdap:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 55'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/sdks/java/io/cdap> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--numberOfRecords=600000","--bigQueryDataset=beam_performance","--bigQueryTable=cdapioit_results","--influxMeasurement=cdapioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresServerName=35.188.98.0","--postgresSsl=false","--postgresPort=5432","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.43.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/sdks/java/io/cdap/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 55'
Successfully started process 'Gradle Test Executor 55'

org.apache.beam.sdk.io.cdap.CdapIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.2.8/22d21c4dfc77adf6f2f24bf3991846792de50b48/logback-classic-1.2.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-reload4j/1.7.36/db708f7d959dee1857ac524636e85ecf2e1781c1/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.43.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.30/c21f55139d8141d2231214fb1feaf50a1edca95e/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
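    (The SLF4J warning is benign but worth noting: with several StaticLoggerBinder implementations on the classpath, SLF4J binds to whichever it finds first, here logback-classic, so log output from the test may not go where the other backends are configured to send it. The usual remedy is excluding the redundant slf4j-* backends from the dependencies that drag them in.)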

org.apache.beam.sdk.io.cdap.CdapIOIT STANDARD_OUT
    20:41:22.624 [Test ****] DEBUG org.apache.beam.sdk.options.PipelineOptionsFactory - Provided Arguments: {tempRoot=[gs://temp-storage-for-perf-tests], project=[apache-beam-testing], runner=[DataflowRunner], numberOfRecords=[600000], bigQueryDataset=[beam_performance], bigQueryTable=[cdapioit_results], influxMeasurement=[cdapioit_results], influxDatabase=[beam_test_metrics], influxHost=[http://10.128.0.96:8086], postgresUsername=[postgres], postgresPassword=[uuinkks], postgresDatabaseName=[postgres], postgresServerName=[35.188.98.0], postgresSsl=[false], postgresPort=[5432], numWorkers=[5], autoscalingAlgorithm=[NONE], ****HarnessContainerImage=[], dataflowWorkerJar=[<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.43.0-SNAPSHOT.jar],> region=[us-central1]}
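The "Provided Arguments" map above is produced by Beam's options parser: each --name=value pair is matched against a getter/setter on a registered PipelineOptions interface. A minimal sketch of the same mechanism, using only a flag from the core SDK (class name illustrative):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class OptionsSketch {
      public static void main(String[] args) {
        // "--jobName=..." binds to PipelineOptions#getJobName/#setJobName.
        // Runner- and GCP-specific flags such as --project or --region resolve
        // only when the modules registering those interfaces are on the classpath.
        PipelineOptions options =
            PipelineOptionsFactory.fromArgs("--jobName=cdapioit-sketch").create();
        System.out.println(options.getJobName());
      }
    }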
    20:41:32.902 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Attempt #1 of 3 failed: The connection attempt failed..
    20:41:32.902 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Retrying in 2000 ms.
    20:41:44.911 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Attempt #2 of 3 failed: The connection attempt failed..
    20:41:44.911 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Retrying in 4000 ms.
    20:41:58.922 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Attempt #3 of 3 failed: The connection attempt failed..
    20:41:59.139 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Attempt #1 of 3 failed: ERROR: table "beamtest_cdapioit_2022_10_01_20_41_22_855" does not exist.
    20:41:59.139 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Retrying in 2000 ms.
    20:42:01.180 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Attempt #2 of 3 failed: ERROR: table "beamtest_cdapioit_2022_10_01_20_41_22_855" does not exist.
    20:42:01.180 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Retrying in 4000 ms.
    20:42:05.221 [Test ****] WARN org.apache.beam.sdk.io.common.IOITHelper - Attempt #3 of 3 failed: ERROR: table "beamtest_cdapioit_2022_10_01_20_41_22_855" does not exist.
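The three attempts with 2000 ms and 4000 ms pauses reflect a retry-with-exponential-backoff helper. A minimal sketch of that pattern, shaped to match the log messages above; IOITHelper's actual implementation may differ in detail:

    import java.util.concurrent.Callable;

    public class RetrySketch {
      static <T> T executeWithRetry(Callable<T> action) throws Exception {
        int maxAttempts = 3;
        long delayMs = 2000;
        for (int attempt = 1; ; attempt++) {
          try {
            return action.call();
          } catch (Exception e) {
            System.err.printf("Attempt #%d of %d failed: %s%n", attempt, maxAttempts, e.getMessage());
            if (attempt == maxAttempts) {
              throw e; // attempts exhausted: surface the last failure to the caller
            }
            System.err.printf("Retrying in %d ms.%n", delayMs);
            Thread.sleep(delayMs);
            delayMs *= 2; // double the pause before the next attempt
          }
        }
      }
    }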

org.apache.beam.sdk.io.cdap.CdapIOIT > classMethod FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:89)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:97)
        at org.apache.beam.sdk.io.cdap.CdapIOIT.createTable(CdapIOIT.java:230)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:86)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:66)
        at org.apache.beam.sdk.io.cdap.CdapIOIT.setup(CdapIOIT.java:115)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 14 more
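The root cause is the innermost SocketTimeoutException: the TCP connection to the Postgres host never completed, so nothing JDBC-specific is involved yet. A minimal sketch that reproduces just that step in isolation, assuming the host and port from the pipeline options above:

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class ConnectivityCheck {
      public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket()) {
          // Fails with the same SocketTimeoutException when the host is unreachable.
          socket.connect(new InetSocketAddress("35.188.98.0", 5432), 10_000);
          System.out.println("TCP connect succeeded");
        }
      }
    }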

Gradle Test Executor 55 finished executing tests.

> Task :sdks:java:io:cdap:integrationTest FAILED

org.apache.beam.sdk.io.cdap.CdapIOIT > classMethod FAILED
    org.postgresql.util.PSQLException: ERROR: table "beamtest_cdapioit_2022_10_01_20_41_22_855" does not exist
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
        at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
        at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:322)
        at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:308)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:284)
        at org.postgresql.jdbc.PgStatement.executeUpdate(PgStatement.java:258)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.deleteTable(DatabaseTestHelper.java:107)
        at org.apache.beam.sdk.io.cdap.CdapIOIT.deleteTable(CdapIOIT.java:234)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:86)
        at org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:66)
        at org.apache.beam.sdk.io.cdap.CdapIOIT.tearDown(CdapIOIT.java:120)
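This second failure is a knock-on effect of the first: setup never managed to create the table, so the plain DROP in tearDown fails as well. A teardown that tolerates the missing table avoids the duplicate report; a minimal sketch (the helper name is hypothetical, not DatabaseTestHelper's actual API):

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class TeardownSketch {
      /** Drops the table if present; a no-op when setup never created it. */
      static void dropTableIfExists(Connection connection, String tableName) throws SQLException {
        try (Statement statement = connection.createStatement()) {
          statement.executeUpdate("DROP TABLE IF EXISTS " + tableName);
        }
      }
    }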

2 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/sdks/java/io/cdap/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/sdks/java/io/cdap/build/reports/tests/integrationTest>
:sdks:java:io:cdap:integrationTest (Thread[Execution **** Thread 4,5,main]) completed. Took 46.705 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:cdap:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src/sdks/java/io/cdap/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57s
142 actionable tasks: 84 executed, 56 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/uyiihrldgaaw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
