Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/01/27 01:45:10 UTC

Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #3580

See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3580/display/redirect?page=changes>

Changes:

[nflavour] Initial commit of boilerplate setup of change stream pipeline for

[ahmedabualsaud] Fix SchemaTransform identifiers

[noreply] Update chromedriver-binary requirement in /sdks/python (#25178)

[noreply] Bump google.golang.org/grpc from 1.52.0 to 1.52.3 in /sdks (#25181)

[noreply] Fix a couple typos caught by an internal linter (#25188)

[noreply] Update Dataflow container versions (#25192)


------------------------------------------
[...truncated 953.61 KB...]
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s16
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s17
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as step s18
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s19
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s20
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s21
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s22
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s23
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s24
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s25
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s26
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s27
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.46.0-SNAPSHOT
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-01-26_17_15_07-11351705346048958909?project=apache-beam-testing
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2023-01-26_17_15_07-11351705346048958909
    Jan 27, 2023 1:15:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2023-01-26_17_15_07-11351705346048958909
    Jan 27, 2023 1:15:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2023-01-27T01:15:10.716Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: kafkaioit0testkafkaioreadsandwritescorrectlyinbatch-jenkin-yd73. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Jan 27, 2023 1:15:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:23.707Z: Worker configuration: e2-standard-2 in us-central1-b.
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.140Z: Expanding CoGroupByKey operations into optimizable parts.
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.269Z: Expanding GroupByKey operations into optimizable parts.
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.299Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.472Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.494Z: Elided trivial flatten 
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.526Z: Unzipping flatten s19 for input s17.org.apache.beam.sdk.values.PCollection.<init>:405#e63ad18dfc139ac0
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.551Z: Fusing unzipped copy of PAssert$0/GroupGlobally/WithKeys/AddKeys/Map, through flatten PAssert$0/GroupGlobally/Flatten.PCollections, into producer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables)
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.585Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Write into PAssert$0/GroupGlobally/GroupByKey/Reify
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.618Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GroupByKey/Read
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.649Z: Fusing consumer PAssert$0/GroupGlobally/Values/Values/Map into PAssert$0/GroupGlobally/GroupByKey/GroupByWindow
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.685Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(Concat) into PAssert$0/GroupGlobally/Values/Values/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.720Z: Fusing consumer PAssert$0/GetPane/Map into PAssert$0/GroupGlobally/ParDo(Concat)
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.798Z: Fusing consumer PAssert$0/RunChecks into PAssert$0/GetPane/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.833Z: Fusing consumer PAssert$0/VerifyAssertions/ParDo(DefaultConclude) into PAssert$0/RunChecks
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.867Z: Unzipping flatten s19-u40 for input s21.org.apache.beam.sdk.values.PCollection.<init>:405#24aa17775bf242ac-c38
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.900Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GroupByKey/Reify, through flatten PAssert$0/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.933Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:25.969Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.004Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.036Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.068Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.089Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Read
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.121Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.151Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.183Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.207Z: Fusing consumer Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.238Z: Fusing consumer Measure read time into Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.271Z: Fusing consumer Map records to strings/Map into Measure read time
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.297Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Map records to strings/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.333Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.369Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.402Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.434Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.464Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.499Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.521Z: Fusing consumer PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) into Calculate hashcode/Values/Values/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.553Z: Fusing consumer PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.584Z: Fusing consumer PAssert$0/GroupGlobally/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.606Z: Fusing consumer PAssert$0/GroupGlobally/GroupByKey/Reify into PAssert$0/GroupGlobally/WithKeys/AddKeys/Map
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.925Z: Executing operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Create
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:26.990Z: Starting 5 workers in us-central1-b...
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:27.226Z: Finished operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Create
    Jan 27, 2023 1:15:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:27.376Z: Executing operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write
    Jan 27, 2023 1:15:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:15:28.911Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jan 27, 2023 1:16:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:16:01.482Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Jan 27, 2023 1:16:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:16:01.513Z: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Jan 27, 2023 1:16:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:16:10.942Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jan 27, 2023 1:16:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:16:41.414Z: Workers have started successfully.
    Jan 27, 2023 1:17:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:03.716Z: All workers have finished the startup processes and began to receive work requests.
    Jan 27, 2023 1:17:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:08.975Z: Finished operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Read(CreateSource)+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write
    Jan 27, 2023 1:17:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:09.039Z: Executing operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Close
    Jan 27, 2023 1:17:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:09.097Z: Finished operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Close
    Jan 27, 2023 1:17:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:09.175Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Jan 27, 2023 1:17:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:09.308Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Jan 27, 2023 1:17:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-27T01:17:09.442Z: Executing operation Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Read+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read+Read from bounded Kafka/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds+Measure read time+Map records to strings/Map+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Jan 27, 2023 1:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State value RUNNING
    Jan 27, 2023 1:45:08 AM org.apache.kafka.common.config.AbstractConfig logAll
    INFO: AdminClientConfig values: 
    	bootstrap.servers = [34.133.19.41:32401, 34.123.175.172:32402, 34.171.24.82:32403]
    	client.dns.lookup = default
    	client.id = 
    	connections.max.idle.ms = 300000
    	metadata.max.age.ms = 300000
    	metric.reporters = []
    	metrics.num.samples = 2
    	metrics.recording.level = INFO
    	metrics.sample.window.ms = 30000
    	receive.buffer.bytes = 65536
    	reconnect.backoff.max.ms = 1000
    	reconnect.backoff.ms = 50
    	request.timeout.ms = 120000
    	retries = 5
    	retry.backoff.ms = 100
    	sasl.client.callback.handler.class = null
    	sasl.jaas.config = null
    	sasl.kerberos.kinit.cmd = /usr/bin/kinit
    	sasl.kerberos.min.time.before.relogin = 60000
    	sasl.kerberos.service.name = null
    	sasl.kerberos.ticket.renew.jitter = 0.05
    	sasl.kerberos.ticket.renew.window.factor = 0.8
    	sasl.login.callback.handler.class = null
    	sasl.login.class = null
    	sasl.login.refresh.buffer.seconds = 300
    	sasl.login.refresh.min.period.seconds = 60
    	sasl.login.refresh.window.factor = 0.8
    	sasl.login.refresh.window.jitter = 0.05
    	sasl.mechanism = GSSAPI
    	security.protocol = PLAINTEXT
    	security.providers = null
    	send.buffer.bytes = 131072
    	ssl.cipher.suites = null
    	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    	ssl.endpoint.identification.algorithm = https
    	ssl.key.password = null
    	ssl.keymanager.algorithm = SunX509
    	ssl.keystore.location = null
    	ssl.keystore.password = null
    	ssl.keystore.type = JKS
    	ssl.protocol = TLS
    	ssl.provider = null
    	ssl.secure.random.implementation = null
    	ssl.trustmanager.algorithm = PKIX
    	ssl.truststore.location = null
    	ssl.truststore.password = null
    	ssl.truststore.type = JKS

    Jan 27, 2023 1:45:08 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka version: 2.4.1
    Jan 27, 2023 1:45:08 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka commitId: c57222ae8cd7866b
    Jan 27, 2023 1:45:08 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka startTimeMs: 1674783908173
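
The "AdminClientConfig values:" block and the Kafka version lines above are printed by the Kafka client library itself whenever an AdminClient is constructed against the brokers listed in bootstrap.servers. A minimal sketch of the kind of call that produces this logging follows; what KafkaIOIT actually does with the client (topic cleanup, metadata checks, etc.) is not visible in the truncated log, so that part is an assumption.

    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;

    public class AdminClientSketch {
      public static void main(String[] args) {
        Properties props = new Properties();
        // Bootstrap servers taken from the config dump above.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
            "34.133.19.41:32401,34.123.175.172:32402,34.171.24.82:32403");
        // Constructing the client is what triggers AbstractConfig's INFO dump of all settings.
        try (AdminClient admin = AdminClient.create(props)) {
          // Illustrative only: the specific admin operation used by the test is not shown here.
        }
      }
    }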

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInBatch FAILED
    java.lang.AssertionError: expected:<DONE> but was:<null>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:120)
        at org.junit.Assert.assertEquals(Assert.java:146)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInBatch(KafkaIOIT.java:275)

1 test completed, 1 failed
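
The assertion failure above is consistent with the 1:45:08 WARNING earlier in the log ("No terminal state was returned within allotted timeout. State value RUNNING"): on Dataflow, PipelineResult.waitUntilFinish(Duration) returns null when the job has not reached a terminal state before the deadline, so comparing that return value against State.DONE fails with expected:<DONE> but was:<null>. The job (2023-01-26_17_15_07-11351705346048958909) was still in its read/hash stage roughly 28 minutes after submission when the timeout fired. A minimal sketch of the pattern, illustrative only and not the actual KafkaIOIT source:

    import static org.junit.Assert.assertEquals;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class WaitUntilFinishSketch {
      // Runs the pipeline and asserts a terminal DONE state within the given timeout.
      static void runAndExpectDone(Pipeline pipeline, Duration timeout) {
        PipelineResult result = pipeline.run();
        // On Dataflow this returns null if the job is still RUNNING when the timeout
        // expires, which is what the waitUntilFinish WARNING above reports.
        PipelineResult.State state = result.waitUntilFinish(timeout);
        assertEquals(PipelineResult.State.DONE, state);
      }
    }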
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker Thread 7,5,main]) completed. Took 37 mins 21.492 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 37m 45s
137 actionable tasks: 1 executed, 136 up-to-date

Publishing build scan...
https://gradle.com/s/lb7azua255h3w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_PerformanceTests_Kafka_IO #3582

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3582/display/redirect?page=changes>



Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #3581

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3581/display/redirect>

Changes:


------------------------------------------
[...truncated 9.31 KB...]
+ read -r usedPort
+ '[' 32402 = 32383 ']'
+ IFS=
+ read -r usedPort
+ false
+ echo 32402
+ return 0
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins8646096636002379869.sh
+ set -xo pipefail
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> getAvailablePort 32402 32767
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> getAvailablePort 32402 32767
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-3581
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581'
+ getAvailablePort 32402 32767
+ local lowRangePort=32403
+ local highRangePort=32767
+ local used=false
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc --all-namespaces -o   go-template='\''{{range .items}}{{range.spec.ports}}{{if .nodePort}}{{.nodePort}}{{"\n"}}{{end}}{{end}}{{end}}'\'''
+ local usedPorts
+ sed 's/^/KAFKA_SERVICE_PORT_2=/'
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc --all-namespaces -o   go-template='\''{{range .items}}{{range.spec.ports}}{{if .nodePort}}{{.nodePort}}{{"\n"}}{{end}}{{end}}{{end}}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc --all-namespaces -o 'go-template={{range .items}}{{range.spec.ports}}{{if .nodePort}}{{.nodePort}}{{"\n"}}{{end}}{{end}}{{end}}'
+ usedPorts='31977
30409
32383'
+ local availablePort=32403
++ seq 32403 32767
+ for i in $(seq $lowRangePort $highRangePort)
+ IFS=
+ read -r usedPort
+ '[' 32403 = 31977 ']'
+ IFS=
+ read -r usedPort
+ '[' 32403 = 30409 ']'
+ IFS=
+ read -r usedPort
+ '[' 32403 = 32383 ']'
+ IFS=
+ read -r usedPort
+ false
+ echo 32403
+ return 0
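
For readers following the -x trace above: getAvailablePort in .test-infra/kubernetes/kubernetes.sh lists the NodePorts already claimed by services (via the go-template shown), then walks the requested range and echoes the first port not in use (the trace shows the scan starting at 32403 for the call getAvailablePort 32402 32767). A simplified sketch of that behaviour, not the exact script:

    # Simplified sketch; the real script runs kubectl through the $KUBECTL wrapper with
    # --kubeconfig/--namespace set, and derives its starting port from its first argument.
    getAvailablePort() {
      local lowRangePort=$1 highRangePort=$2
      local usedPorts
      usedPorts=$(kubectl get svc --all-namespaces \
        -o go-template='{{range .items}}{{range .spec.ports}}{{if .nodePort}}{{.nodePort}}{{"\n"}}{{end}}{{end}}{{end}}')
      for port in $(seq "$lowRangePort" "$highRangePort"); do
        if ! grep -qx "$port" <<<"$usedPorts"; then
          echo "$port"   # first free port in the range
          return 0
        fi
      done
      return 1
    }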
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins7139051424248758724.sh
+ sed -i -e s/32400/32401/ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster/04-outside-services/outside-0.yml>
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins4570473093850874041.sh
+ sed -i -e s/32401/32402/ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster/04-outside-services/outside-1.yml>
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins1917301965231362792.sh
+ sed -i -e s/32402/32403/ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster/04-outside-services/outside-2.yml>
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins8044922198775079476.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-3581
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581'
+ apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster'>
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
storageclass.storage.k8s.io/kafka-broker unchanged
storageclass.storage.k8s.io/kafka-zookeeper unchanged
clusterrole.rbac.authorization.k8s.io/node-reader unchanged
clusterrolebinding.rbac.authorization.k8s.io/kafka-node-reader configured
role.rbac.authorization.k8s.io/pod-labler created
rolebinding.rbac.authorization.k8s.io/kafka-pod-labler created
configmap/zookeeper-config created
service/pzoo created
service/zookeeper created
statefulset.apps/pzoo created
service/outside-0 created
service/outside-1 created
service/outside-2 created
configmap/broker-config created
service/broker created
service/bootstrap created
statefulset.apps/kafka created
configmap/kafka-config created
job.batch/kafka-config-eff079ec created
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins7909273448409796808.sh
+ set -eo pipefail
+ sed 's/^/KAFKA_BROKER_0=/'
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP outside-0
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP outside-0
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-3581
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581'
+ loadBalancerIP outside-0
+ local name=outside-0
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 1 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 2 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 3 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 4 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 5 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n '' ]]
+ [[ 6 == \3\6 ]]
+ sleep 10
+ (( i++ ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-0 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=34.28.75.192
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 34.28.75.192 ]]
+ echo 34.28.75.192
+ return 0
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins3315193389621810233.sh
+ set -eo pipefail
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP outside-1
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP outside-1
+ sed 's/^/KAFKA_BROKER_1=/'
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-3581
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581'
+ loadBalancerIP outside-1
+ local name=outside-1
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-1 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-1 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-1 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-1 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-1 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=34.170.25.67
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 34.170.25.67 ]]
+ echo 34.170.25.67
+ return 0
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins6056950572421862507.sh
+ set -eo pipefail
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP outside-2
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> loadBalancerIP outside-2
+ sed 's/^/KAFKA_BROKER_2=/'
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-3581
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581'
+ loadBalancerIP outside-2
+ local name=outside-2
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-2 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ retry 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-2 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\''' 36 10
+ local 'command=kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-2 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+ local max_retries=36
+ local sleep_time=10
+ (( i = 1 ))
+ (( i <= max_retries ))
+ local output
++ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-2 -ojsonpath='\''{.status.loadBalancer.ingress[0].ip}'\'''
+++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 get svc outside-2 '-ojsonpath={.status.loadBalancer.ingress[0].ip}'
+ output=35.193.104.48
+ local status=0
+ [[ 0 == 0 ]]
+ [[ -n 35.193.104.48 ]]
+ echo 35.193.104.48
+ return 0
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties file path 'job.properties'
[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins8052996834254211554.sh
+ eval <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> waitForJob job.batch/kafka-config-eff079ec 40m
++ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> waitForJob job.batch/kafka-config-eff079ec 40m
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-3581
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581'
+ waitForJob job.batch/kafka-config-eff079ec 40m
+ echo 'Waiting for job completion...'
Waiting for job completion...
+ jobName=job.batch/kafka-config-eff079ec
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 wait --for=condition=complete --timeout=40m job.batch/kafka-config-eff079ec'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-3581> --namespace=beam-performancetests-kafka-io-3581 wait --for=condition=complete --timeout=40m job.batch/kafka-config-eff079ec
error: timed out waiting for the condition on jobs/kafka-config-eff079ec
Build step 'Execute shell' marked build as failure
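
Build #3581 therefore failed during cluster setup, before any Beam test ran: the kafka-config job in namespace beam-performancetests-kafka-io-3581 never reached condition=complete within the 40-minute kubectl wait. When this recurs, standard kubectl diagnostics against that namespace usually show why the job's pod is stuck; these are generic commands, not part of the job's scripts, and the kubeconfig path is a placeholder:

    KUBECTL="kubectl --kubeconfig=<path-to-workspace-kubeconfig> --namespace=beam-performancetests-kafka-io-3581"
    $KUBECTL describe job.batch/kafka-config-eff079ec      # events, completions, failure reasons
    $KUBECTL get pods -l job-name=kafka-config-eff079ec    # pods created by the job
    $KUBECTL logs -l job-name=kafka-config-eff079ec --tail=100
    $KUBECTL get pods -o wide                               # broker/zookeeper pods the config job depends on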

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org