Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/11/11 02:28:48 UTC

Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1527

See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1527/display/redirect>

Changes:


------------------------------------------
[...truncated 724.79 KB...]
    INFO: Uploading 209 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-uxcC8XNP8aCUy6x3v6I0wkjRT0BC3xtdpi3_h30oK5Y.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-qGs-u6adpupdMmB1RhmcUuCFtU4-1n5b_azNov1CL3Q.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-n59WOqPklZsaGNYFXSbNv_g60VP5DPMFRsLqJDILjGA.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-g1xC9t_0W5Hlt5puxJxC8eSF6t0E6Ry3iN4eoMN_Z0Y.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-VfZIdppuDQQ8fOVFiHc9YDrgJy7ggC9W5dRtIRA7gj0.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-GdmQOv6D-sScmCwTbqDs7bK5LIqFs_uR4goHW1RIyQA.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/synthetic/build/libs/beam-sdks-java-io-synthetic-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-synthetic-2.27.0-SNAPSHOT-L4nvZa6AXFn55Z06vvHYTR0KhEzXvUZxnNSaRIHbFis.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-PwE0mjt84vjZwEDgelUCfjFqPJSok9MH3bDOw6vanmg.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/main1309005404095418629.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/main-KUzNqS_kn-mqe2eL9sPYn1g9bV_-0JgbhsSY5-A5_DI.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-0Wk7hRof89t1Ruqmyf_tL0ID4QY5wOLoT3IvhE7SS4Y.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-v8EI_0XOagSw9yhsc-ungsSpMMFR-fubIXEOehOcmoU.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-04lNEqMH4HXxaUOkgUKkmh5KjbLFOcWXLu4XBtxNvMw.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test185081093958747600.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-_1D5Wq5wrpgQPIDdra5mCA4IKNijNu_FGDwRh33jHGQ.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-F2S20v95WIldQ4OO-RWBQY26SnvisNz2E2OXii0gicw.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-EkIS3TABj_kwWmkDYmUW5xAcN7cXywqpfqRqriX1GhI.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-ygC50sV8JOTc-GJ-47M9A5z5VynmFi9kAJRymK8L8M0.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-UTC-79Phd0nv2e8xQWRJgiE2C5RJC3ik8xJtXHe8EJA.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-LOZCSmPn9h6OxaCLAtXz3yUZuIa4MDhy39DwzRxMliI.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-MVp7vmvuSpsTdtwMRC7zlq-7bhb-o_B7kqojJGCB9GY.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-SLiW9YElUzQgTZ55RBYJKmXhN_X_k4kVr4NjcbrSXqE.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-zgPDFjwVVMtfB0pX8uAH7l_5vu9HGgs-EQBNtFLPIdk.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-zxXD89uTQYXQ6-nhhuVA9i3wtXoUQ-BtoG4HmjQvZK0.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-qsiVb1z0qz15u-gYfqFtlKWQUiqEDP1kAZYkuOliRlc.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-u360jAotUMAZAu-Ruab7VKIYC2Kt9LP9YEVhiw_IPSI.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.27.0-SNAPSHOT-L69V_oTyVsCafMK8agNDjhjgaK2sxJoIOiM2AWNuLl0.jar
    Nov 11, 2020 2:22:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.27.0-SNAPSHOT-hwIPcVjEhA5oHXd0ESXgHmoXDo1B6LVBP618ttfNg24.jar
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 26 files newly uploaded in 3 seconds
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/Impulse as step s1
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s4
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step s6
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <94557 bytes, hash da216bb2d08ab3f6778a3038f91f9a0f8f1dbb8cbe56b09c8ab0856ee6f3f87b> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-2iFrstCKs_Z3ijA4-R-aD48du4y-VrCcirCFbubz-Hs.pb
    Nov 11, 2020 2:22:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 2:22:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_18_22_50-6112486370799639190?project=apache-beam-testing
    Nov 11, 2020 2:22:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_18_22_50-6112486370799639190
    Nov 11, 2020 2:22:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_18_22_50-6112486370799639190
    Nov 11, 2020 2:23:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:04.108Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.157Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.198Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.265Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.298Z: Expanding GroupByKey operations into optimizable parts.
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.394Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.429Z: Fusing consumer Generate records/ParDo(OutputSingleSource) into Generate records/Impulse
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.474Z: Fusing consumer s3/PairWithRestriction into Generate records/ParDo(OutputSingleSource)
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.511Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.550Z: Fusing consumer Measure write time into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.581Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:05.620Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:06.020Z: Executing operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 2:23:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:06.111Z: Starting 5 workers in us-central1-f...
    Nov 11, 2020 2:23:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:17.545Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 2:23:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:37.788Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 2:23:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:37.826Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 2:23:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:23:48.148Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 2:24:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:24:02.835Z: Workers have started successfully.
    Nov 11, 2020 2:24:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:24:02.870Z: Workers have started successfully.
    Nov 11, 2020 2:24:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:24:45.663Z: Finished operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 2:24:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:24:45.845Z: Executing operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 2:27:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:27:42.536Z: Finished operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 2:27:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:27:42.707Z: Cleaning up.
    Nov 11, 2020 2:27:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:27:42.805Z: Stopping worker pool...
    Nov 11, 2020 2:28:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:28:33.386Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 11, 2020 2:28:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:28:33.442Z: Worker pool stopped.
    Nov 11, 2020 2:28:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_18_22_50-6112486370799639190 finished with status DONE.
    Nov 11, 2020 2:28:39 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Nov 11, 2020 2:28:40 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Nov 11, 2020 2:28:40 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Nov 11, 2020 2:28:40 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 209 files. Enable logging at DEBUG level to see which files will be staged.
    Nov 11, 2020 2:28:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOWithRunnerV2 FAILED
    java.lang.IllegalArgumentException: unable to serialize DoFnWithExecutionInformation{doFn=org.apache.beam.sdk.io.kafka.KafkaIOIT$1@4fd017fc, mainOutputTag=Tag<org.apache.beam.sdk.values.PCollection.<init>:402#bf09ad1bfb1f38d6>, sideInputMapping={}, schemaInformation=DoFnSchemaInformation{elementConverters=[]}}
        at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
        at org.apache.beam.runners.core.construction.ParDoTranslation.translateDoFn(ParDoTranslation.java:692)
        at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$PayloadTranslator$1.translateDoFn(PrimitiveParDoSingleFactory.java:218)
        at org.apache.beam.runners.core.construction.ParDoTranslation.payloadForParDoLike(ParDoTranslation.java:814)
        at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$PayloadTranslator.payloadForParDoSingle(PrimitiveParDoSingleFactory.java:214)
        at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$PayloadTranslator.translate(PrimitiveParDoSingleFactory.java:163)
        at org.apache.beam.runners.core.construction.PTransformTranslation$KnownTransformPayloadTranslator.translate(PTransformTranslation.java:428)
        at org.apache.beam.runners.core.construction.PTransformTranslation.toProto(PTransformTranslation.java:238)
        at org.apache.beam.runners.core.construction.SdkComponents.registerPTransform(SdkComponents.java:175)
        at org.apache.beam.runners.core.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:87)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:587)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:579)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:239)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:213)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:468)
        at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:927)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:353)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:334)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2(KafkaIOIT.java:154)

        Caused by:
        java.io.NotSerializableException: org.apache.beam.sdk.io.kafka.KafkaIOIT
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
            at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
            at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
            ... 21 more
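
The NotSerializableException above names KafkaIOIT$1, i.e. an anonymous DoFn declared inside KafkaIOIT. An anonymous inner class keeps an implicit reference to its enclosing instance, so serializing the DoFn drags in the whole (non-Serializable) test class. A minimal sketch of the pitfall and the usual fix, with illustrative names rather than the actual KafkaIOIT code:

    import org.apache.beam.sdk.transforms.DoFn;

    public class ExampleIT { // not Serializable, like a typical JUnit test class

      // Broken: the anonymous DoFn is an inner class of ExampleIT, so Java
      // serialization also tries to serialize the enclosing ExampleIT instance
      // and fails with java.io.NotSerializableException: ExampleIT.
      final DoFn<String, String> broken =
          new DoFn<String, String>() {
            @ProcessElement
            public void processElement(ProcessContext c) {
              c.output(c.element());
            }
          };

      // Fix: a static nested (or top-level) DoFn holds no hidden reference to
      // the enclosing instance, so only the DoFn itself is serialized.
      static class PassThroughFn extends DoFn<String, String> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element());
        }
      }
    }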

1 test completed, 1 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.043 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 6 mins 2.817 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111021811
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111021811
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c0f99c31d6c773adeb34a38fb4630888e0fc8303041137b515d111da75dd5daf
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111021811
Successfully started process 'command 'gcloud''
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c0f99c31d6c773adeb34a38fb4630888e0fc8303041137b515d111da75dd5daf
  Associated tags:
 - 20201111021811
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111021811
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111021811].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c0f99c31d6c773adeb34a38fb4630888e0fc8303041137b515d111da75dd5daf].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4.192 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 49s
133 actionable tasks: 99 executed, 32 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/546uqflhkteqg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Kafka_IO #1536

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1536/display/redirect?page=changes>




Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1535

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1535/display/redirect>

Changes:


------------------------------------------
[...truncated 320.15 KB...]
    Nov 11, 2020 7:41:58 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-y0V9-VwQQfjtKBE-RJIcHHxMVXXr21XBc7_ZFuxttiY.jar
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 210 files cached, 0 files newly uploaded in 0 seconds
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records as step s1
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s2
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s3
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step s4
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <91366 bytes, hash 8a5ecd28698cf70adbff3ed021657e54e96d74a5f63d8dc5875f4973137929dc> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-il7NKGmM9wrb_z7QIWV-VOltdKX2PY3Fh19JcxN5Kdw.pb
    Nov 11, 2020 7:41:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 7:42:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_23_41_59-7205697585914388330?project=apache-beam-testing
    Nov 11, 2020 7:42:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_23_41_59-7205697585914388330
    Nov 11, 2020 7:42:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_23_41_59-7205697585914388330
    Nov 11, 2020 7:42:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:10.781Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.479Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.532Z: Expanding GroupByKey operations into optimizable parts.
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.679Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.759Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.793Z: Fusing consumer Measure write time into Generate records
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.824Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:11.879Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:12.422Z: Executing operation Generate records+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 7:42:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:12.521Z: Starting 5 workers in us-central1-f...
    Nov 11, 2020 7:42:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:35.846Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 7:42:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:43.372Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 7:42:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:43.402Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 7:42:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:42:53.734Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 7:43:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:43:04.374Z: Workers have started successfully.
    Nov 11, 2020 7:43:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:43:04.398Z: Workers have started successfully.
    Nov 11, 2020 7:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:46:14.098Z: Finished operation Generate records+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 7:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:46:14.244Z: Cleaning up.
    Nov 11, 2020 7:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:46:14.325Z: Stopping worker pool...
    Nov 11, 2020 7:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:07.522Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 11, 2020 7:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:07.566Z: Worker pool stopped.
    Nov 11, 2020 7:47:15 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_23_41_59-7205697585914388330 finished with status DONE.
    Nov 11, 2020 7:47:15 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Nov 11, 2020 7:47:15 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Nov 11, 2020 7:47:15 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Nov 11, 2020 7:47:15 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 209 files. Enable logging at DEBUG level to see which files will be staged.
    Nov 11, 2020 7:47:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 11, 2020 7:47:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 11, 2020 7:47:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-y0V9-VwQQfjtKBE-RJIcHHxMVXXr21XBc7_ZFuxttiY.jar
    Nov 11, 2020 7:47:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 210 files cached, 0 files newly uploaded in 2 seconds
    Nov 11, 2020 7:47:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
    Nov 11, 2020 7:47:19 AM org.apache.kafka.common.config.AbstractConfig logAll
    INFO: ConsumerConfig values: 
    	auto.commit.interval.ms = 5000
    	auto.offset.reset = earliest
    	bootstrap.servers = [35.188.14.22:32400, 35.225.34.64:32401, 34.67.109.71:32402]
    	check.crcs = true
    	client.id = 
    	connections.max.idle.ms = 540000
    	enable.auto.commit = false
    	exclude.internal.topics = true
    	fetch.max.bytes = 52428800
    	fetch.max.wait.ms = 500
    	fetch.min.bytes = 1
    	group.id = 
    	heartbeat.interval.ms = 3000
    	interceptor.classes = null
    	internal.leave.group.on.close = true
    	isolation.level = read_uncommitted
    	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    	max.partition.fetch.bytes = 1048576
    	max.poll.interval.ms = 300000
    	max.poll.records = 500
    	metadata.max.age.ms = 300000
    	metric.reporters = []
    	metrics.num.samples = 2
    	metrics.recording.level = INFO
    	metrics.sample.window.ms = 30000
    	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    	receive.buffer.bytes = 524288
    	reconnect.backoff.max.ms = 1000
    	reconnect.backoff.ms = 50
    	request.timeout.ms = 305000
    	retry.backoff.ms = 100
    	sasl.jaas.config = null
    	sasl.kerberos.kinit.cmd = /usr/bin/kinit
    	sasl.kerberos.min.time.before.relogin = 60000
    	sasl.kerberos.service.name = null
    	sasl.kerberos.ticket.renew.jitter = 0.05
    	sasl.kerberos.ticket.renew.window.factor = 0.8
    	sasl.mechanism = GSSAPI
    	security.protocol = PLAINTEXT
    	send.buffer.bytes = 131072
    	session.timeout.ms = 10000
    	ssl.cipher.suites = null
    	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    	ssl.endpoint.identification.algorithm = null
    	ssl.key.password = null
    	ssl.keymanager.algorithm = SunX509
    	ssl.keystore.location = null
    	ssl.keystore.password = null
    	ssl.keystore.type = JKS
    	ssl.protocol = TLS
    	ssl.provider = null
    	ssl.secure.random.implementation = null
    	ssl.trustmanager.algorithm = PKIX
    	ssl.truststore.location = null
    	ssl.truststore.password = null
    	ssl.truststore.type = JKS
    	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
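
The ConsumerConfig dump above is the effective configuration KafkaIO hands to the underlying Kafka consumer. As a rough sketch of how such a read is wired up (the bootstrap servers are taken from the dump; the topic name, inferred from the "beam-0" partition logged below, and the config override are assumptions for illustration):

    import java.util.Collections;
    import org.apache.beam.sdk.io.kafka.KafkaIO;

    public class KafkaReadSketch {
      // Hedged sketch: byte[] keys/values match the ByteArrayDeserializer
      // entries in the dump; "beam" is an assumed topic name.
      static KafkaIO.Read<byte[], byte[]> readFromTestCluster() {
        return KafkaIO.readBytes()
            .withBootstrapServers(
                "35.188.14.22:32400,35.225.34.64:32401,34.67.109.71:32402")
            .withTopic("beam")
            .withConsumerConfigUpdates(
                Collections.<String, Object>singletonMap(
                    "auto.offset.reset", "earliest"));
      }
    }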

    Nov 11, 2020 7:47:19 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka version : 1.0.0
    Nov 11, 2020 7:47:19 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka commitId : aaa7af6d4a11b29d
    Nov 11, 2020 7:47:20 AM org.apache.beam.sdk.io.kafka.KafkaUnboundedSource split
    INFO: Partitions assigned to split 0 (total 1): beam-0
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds as step s2
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s3
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s4
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s5
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <90203 bytes, hash 4dd4cf11625a0a68188e84ae28750af8ecd4a2e61163cc520358d2fb8108764f> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-TdTPEWJaCmgYjoSuKHUK-OzUouYRY8xSA1jS-4EIdk8.pb
    Nov 11, 2020 7:47:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 7:47:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_23_47_20-9329782067611972105?project=apache-beam-testing
    Nov 11, 2020 7:47:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_23_47_20-9329782067611972105
    Nov 11, 2020 7:47:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_23_47_20-9329782067611972105
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:33.961Z: Worker configuration: n1-standard-4 in us-central1-f.
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.596Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.605Z: Expanding SplittableProcessKeyed operations into optimizable parts.
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.608Z: Expanding GroupByKey operations into streaming Read/Write steps
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.610Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.626Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.628Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.630Z: Fusing consumer Measure read time into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.633Z: Fusing consumer Map records to strings/Map into Measure read time
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.635Z: Fusing consumer Counting element into Map records to strings/Map
    Nov 11, 2020 7:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:34.903Z: Starting 5 workers...
    Nov 11, 2020 7:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:37.162Z: Executing operation Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds+Measure read time+Map records to strings/Map+Counting element
    Nov 11, 2020 7:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:47:48.801Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 7:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:48:14.687Z: Worker configuration: n1-standard-4 in us-central1-f.
    Nov 11, 2020 7:48:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T07:48:18.456Z: Workers have started successfully.
    Nov 11, 2020 8:02:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State value RUNNING

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOWithRunnerV2 FAILED
    java.lang.AssertionError: expected:<100000000> but was:<34976566>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2(KafkaIOIT.java:144)
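
The assertion failure above follows from the earlier WARNING ("No terminal state was returned within allotted timeout. State value RUNNING"): the read job was still running when the test stopped waiting, so the element counter had only seen 34,976,566 of the expected 100,000,000 records. A rough sketch of that kind of check, with hypothetical names (the timeout roughly matches the ~15 minutes between submission at 7:47 and the warning at 8:02; the metric helper is an assumption, not the actual KafkaIOIT code):

    import static org.junit.Assert.assertEquals;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class ReadCountSketch {
      static void assertAllRecordsRead(Pipeline readPipeline) {
        PipelineResult result = readPipeline.run();
        // On Dataflow this logs the WARNING above and returns without a
        // terminal state if the job is still RUNNING when the timeout
        // elapses, so the metric below reflects only what was read so far.
        result.waitUntilFinish(Duration.standardMinutes(15));
        long recordsRead = countReadElements(result); // hypothetical metric lookup
        assertEquals(100_000_000L, recordsRead);      // observed here: 34_976_566
      }

      // Placeholder for a metrics query (e.g. a per-element counter); the
      // real lookup is an assumption in this sketch.
      static long countReadElements(PipelineResult result) {
        return 0L;
      }
    }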

2 tests completed, 1 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 41 mins 29.955 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 37s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/bnpjfz3bjq5ak

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1534

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1534/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-9547] stubs for non-implemented IO.

[Robert Bradshaw] Allow not/won't imlement helpers to be used for functions as well as

[Kyle Weaver] [BEAM-9855] Fix merge conflict between #13116 and #13240.


------------------------------------------
[...truncated 236.33 KB...]
Task ':runners:java-fn-execution:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/java-fn-execution/build/resources/main',> not found
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.05 secs.
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 4,5,main]) started.
:sdks:java:expansion-service:compileJava (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:expansion-service:compileJava FROM-CACHE
Custom actions are attached to task ':sdks:java:expansion-service:compileJava'.
Build cache key for task ':sdks:java:expansion-service:compileJava' is 3440040ab3e5557e56ca1eeb3d7ce7f5
Task ':sdks:java:expansion-service:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:expansion-service:compileJava' with cache key 3440040ab3e5557e56ca1eeb3d7ce7f5
:sdks:java:expansion-service:compileJava (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.04 secs.
:sdks:java:expansion-service:classes (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:expansion-service:classes UP-TO-DATE
Skipping task ':sdks:java:expansion-service:classes' as it has no actions.
:sdks:java:expansion-service:classes (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:expansion-service:jar (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:expansion-service:jar
Caching disabled for task ':sdks:java:expansion-service:jar' because:
  Caching has not been enabled for the task
Task ':sdks:java:expansion-service:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/expansion-service/build/resources/main',> not found
:sdks:java:expansion-service:jar (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.008 secs.
:sdks:java:io:google-cloud-platform:compileJava (Thread[Execution worker for ':' Thread 3,5,main]) started.
:sdks:java:io:kafka:compileJava (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:direct-java:compileJava FROM-CACHE
Custom actions are attached to task ':runners:direct-java:compileJava'.
Build cache key for task ':runners:direct-java:compileJava' is 8856d4ced88c96137b17c8017b30937c
Task ':runners:direct-java:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:direct-java:compileJava' with cache key 8856d4ced88c96137b17c8017b30937c
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.095 secs.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:direct-java:classes UP-TO-DATE
Skipping task ':runners:direct-java:classes' as it has no actions.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:runners:direct-java:shadowJar (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:io:kafka:compileJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:kafka:compileJava'.
Build cache key for task ':sdks:java:io:kafka:compileJava' is ced358201888b4e63668bc7a517f6466
Task ':sdks:java:io:kafka:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:kafka:compileJava' with cache key ced358201888b4e63668bc7a517f6466
:sdks:java:io:kafka:compileJava (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.084 secs.
:sdks:java:io:kafka:classes (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:kafka:classes UP-TO-DATE
Skipping task ':sdks:java:io:kafka:classes' as it has no actions.
:sdks:java:io:kafka:classes (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:sdks:java:io:kafka:compileTestJava (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:kafka:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:kafka:compileTestJava'.
Build cache key for task ':sdks:java:io:kafka:compileTestJava' is 434e676acd059e7c18be975279b62535
Task ':sdks:java:io:kafka:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:kafka:compileTestJava' with cache key 434e676acd059e7c18be975279b62535
:sdks:java:io:kafka:compileTestJava (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.118 secs.
:sdks:java:io:kafka:testClasses (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:kafka:testClasses UP-TO-DATE
Skipping task ':sdks:java:io:kafka:testClasses' as it has no actions.
:sdks:java:io:kafka:testClasses (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileJava' is b4769fe2723866cefd7011804eed753e
Task ':sdks:java:io:google-cloud-platform:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileJava' with cache key b4769fe2723866cefd7011804eed753e
:sdks:java:io:google-cloud-platform:compileJava (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.282 secs.
:sdks:java:io:google-cloud-platform:classes (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
Skipping task ':sdks:java:io:google-cloud-platform:classes' as it has no actions.
:sdks:java:io:google-cloud-platform:classes (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:jar (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:jar
Caching disabled for task ':sdks:java:io:google-cloud-platform:jar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/resources/main>', not found
:sdks:java:io:google-cloud-platform:jar (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.16 secs.
:runners:google-cloud-dataflow-java:compileJava (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileJava' is 4fd0d12757f0faac921353ad7a6f77da
Task ':runners:google-cloud-dataflow-java:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileJava' with cache key 4fd0d12757f0faac921353ad7a6f77da
:runners:google-cloud-dataflow-java:compileJava (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.162 secs.
:runners:google-cloud-dataflow-java:classes (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:classes
Skipping task ':runners:google-cloud-dataflow-java:classes' as it has no actions.
:runners:google-cloud-dataflow-java:classes (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:jar (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:jar
Caching disabled for task ':runners:google-cloud-dataflow-java:jar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:jar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:jar (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.068 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/src/main/java>', not found
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is 4385a442e08df540bcfd3d520f41e90f
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache key 4385a442e08df540bcfd3d520f41e90f
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.319 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** for ':' Thread 3,5,main]) started.

> Task :runners:direct-java:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/direct-java/build/resources/main>'.
Custom actions are attached to task ':runners:direct-java:shadowJar'.
Caching disabled for task ':runners:direct-java:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/direct-java/build/resources/main>', not found
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 0.696s [696ms]
Average Time/Jar: 0.116s [116.0ms]
*******************
:runners:direct-java:shadowJar (Thread[Daemon ****,5,main]) completed. Took 1.011 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Daemon ****,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 2a2b28b077e46339c27e78e76e8b5d76
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key 2a2b28b077e46339c27e78e76e8b5d76
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Daemon ****,5,main]) completed. Took 0.454 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Daemon ****,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Daemon ****,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Daemon ****,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Daemon ****,5,main]) completed. Took 0.223 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Daemon ****,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 8ddc79afe37289ac6d52f6bddb888dd9
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 8ddc79afe37289ac6d52f6bddb888dd9
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Daemon ****,5,main]) completed. Took 0.201 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Daemon ****,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Daemon ****,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Daemon ****,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>'.
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
:runners:google-cloud-dataflow-java:testJar (Thread[Daemon ****,5,main]) completed. Took 0.036 secs.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main>'.
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package>'.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Caching disabled for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main>', not found
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package>', not found
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 3.352s [3352ms]
Average Time/Jar: 0.2095s [209.5ms]
*******************
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 4.351 secs.
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':' Thread 3,5,main]) started.
Gradle Test Executor 1 started executing tests.
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED
Custom actions are attached to task ':sdks:java:io:kafka:integrationTest'.
Build cache key for task ':sdks:java:io:kafka:integrationTest' is c9ef61266b69cd47a6814d55059abcb0
Task ':sdks:java:io:kafka:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"100000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=kafkaioit_results_sdf_wrapper","--influxMeasurement=kafkaioit_results_sdf_wrapper","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--kafkaBootstrapServerAddresses=35.226.168.127:32400,35.223.133.170:32401,35.222.241.99:32402","--kafkaTopic=beam-runnerv2","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--experiments=beam_fn_api,use_runner_v2,use_unified_****","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.27.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.6.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':' Thread 3,5,main]) completed. Took 1.807 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> No tests found for given includes: [**/*IT.class](include rules) [**/JvmVerification.class](exclude rules) [org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2](--tests filter)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
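
The "No tests found" failure above means the --tests filter (org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2) matched nothing inside the **/*IT.class include set for this run. As a rough, hypothetical illustration of what the filter is looking for (not the actual Beam source), Gradle only selects a test when a compiled class and method line up with that pattern exactly:

    // Hypothetical minimal shape; package, class and method names must match
    // the --tests filter exactly for Gradle to select the test.
    package org.apache.beam.sdk.io.kafka;

    import org.junit.Test;

    public class KafkaIOIT {
      @Test
      public void testKafkaIOWithRunnerV2() {
        // the Kafka read/write pipeline under test would run here
      }
    }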

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/y7deskuvmq4nm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1533

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1533/display/redirect>

Changes:


------------------------------------------
[...truncated 760.80 KB...]
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-dHZJ-he8GnieNf2FvQWQ12yQrKJ9fc1mLI_53868qhU.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-oXlFvQNWqI-TjFBDmB9P5dT2cvm-OUA9cJQto-euHd4.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-t11a4QWd0Vj-pZha3XBUfMYjfc9vG1mZ8wO_xzrzbws.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7448412820660774831.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-yL1nMHj3krcrKIfEyEdbW90_6C4xrOt8PnUCABvR8-0.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/synthetic/build/libs/beam-sdks-java-io-synthetic-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-synthetic-2.27.0-SNAPSHOT-5zUl9f8daNBhIKK4sdQrDlhupuSMEjHKlWEYCQ0apC0.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests--fe6SAyMEnUS4CxQ9DffPrSzD9K0R8WrLj7-9ym1Jm0.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-WDLfrhsrJT8tNXhADm1PbJPj5FU9JnnhKELjCI8jwbg.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.27.0-SNAPSHOT-NbQ87VL6xWLNThM0XD6uKL-XB_N8hf85APstO1593xE.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-****-2.27.0-SNAPSHOT-f_myj6jNEUz8EahScIvleYFRlHLEqM4FOJ4tVwf079k.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-ErI3PIM6qRcJdSuafqaTe_pRUYXf2k4Y3dBYh0EOzTM.jar
    Nov 11, 2020 6:31:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.27.0-SNAPSHOT-osDRALLvI1eWoAqKgGmyDqJSRNl4ExB8AGiavMIYg10.jar
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 26 files newly uploaded in 1 seconds
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/Impulse as step s1
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s4
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step s6
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <94534 bytes, hash 67586b70e96cf0d8ef8b51782723c687acdac752e9a2d19e3f5f9497bf9d6ff1> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-Z1hrcOls8Njvi1F4JyPGh6zax1LpotGeP1-Ul7-db_E.pb
    Nov 11, 2020 6:31:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 6:31:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_22_31_33-1792495779117427463?project=apache-beam-testing
    Nov 11, 2020 6:31:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_22_31_33-1792495779117427463
    Nov 11, 2020 6:31:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_22_31_33-1792495779117427463
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:47.484Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.287Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.326Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.411Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.440Z: Expanding GroupByKey operations into optimizable parts.
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.516Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.552Z: Fusing consumer Generate records/ParDo(OutputSingleSource) into Generate records/Impulse
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.584Z: Fusing consumer s3/PairWithRestriction into Generate records/ParDo(OutputSingleSource)
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.617Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.651Z: Fusing consumer Measure write time into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.682Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:48.717Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:49.115Z: Executing operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 6:31:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:31:49.196Z: Starting 5 ****s in us-central1-f...
    Nov 11, 2020 6:32:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:32:13.661Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 6:32:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:32:17.379Z: Autoscaling: Raised the number of ****s to 3 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 6:32:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:32:17.409Z: Resized **** pool to 3, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 6:32:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:32:27.654Z: Autoscaling: Raised the number of ****s to 5 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 6:32:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:32:42.344Z: Workers have started successfully.
    Nov 11, 2020 6:32:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:32:42.374Z: Workers have started successfully.
    Nov 11, 2020 6:33:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:33:22.328Z: Finished operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 6:33:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:33:22.491Z: Executing operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 6:33:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:33:43.721Z: Finished operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 6:33:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:33:43.887Z: Cleaning up.
    Nov 11, 2020 6:33:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:33:43.984Z: Stopping **** pool...
    Nov 11, 2020 6:34:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:30.632Z: Autoscaling: Resized **** pool from 5 to 0.
    Nov 11, 2020 6:34:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:30.703Z: Worker pool stopped.
    Nov 11, 2020 6:34:37 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_22_31_33-1792495779117427463 finished with status DONE.
    Nov 11, 2020 6:34:37 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Nov 11, 2020 6:34:37 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Nov 11, 2020 6:34:37 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Nov 11, 2020 6:34:38 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 209 files. Enable logging at DEBUG level to see which files will be staged.
    Nov 11, 2020 6:34:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 209 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 209 files cached, 0 files newly uploaded in 0 seconds
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Impulse as step s1
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(OutputSingleSource) as step s2
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(UnboundedSourceAsSDFWrapper) as step s3
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(StripIds) as step s4
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s5
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s6
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s7
    Nov 11, 2020 6:34:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 6:34:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98505 bytes, hash b2bcd270a85b8f54524c5aac64a09775e436797ab69e004a9ca912bfd378bbbe> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-srzScKhbj1RSTFqsZKCXdeQ2eXq2ngBKnKkSv9N4u74.pb
    Nov 11, 2020 6:34:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 6:34:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_22_34_40-959049894787545051?project=apache-beam-testing
    Nov 11, 2020 6:34:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_22_34_40-959049894787545051
    Nov 11, 2020 6:34:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_22_34_40-959049894787545051
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:55.286Z: Worker configuration: n1-standard-4 in us-central1-f.
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.154Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.188Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.245Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.286Z: Expanding SplittableProcessKeyed operations into optimizable parts.
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.319Z: Expanding GroupByKey operations into streaming Read/Write steps
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.352Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.428Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.467Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(OutputSingleSource) into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Impulse
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.530Z: Fusing consumer s3/PairWithRestriction into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(OutputSingleSource)
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.564Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.591Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(StripIds) into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.626Z: Fusing consumer Measure read time into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(StripIds)
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.662Z: Fusing consumer Map records to strings/Map into Measure read time
    Nov 11, 2020 6:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:56.694Z: Fusing consumer Counting element into Map records to strings/Map
    Nov 11, 2020 6:35:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:34:58.297Z: Starting 5 ****s in us-central1-f...
    Nov 11, 2020 6:35:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:35:13.947Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 6:35:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:35:26.080Z: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 11, 2020 6:35:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:35:26.119Z: Resized **** pool to 1, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 6:35:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:35:36.444Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 11, 2020 6:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:36:04.504Z: Workers have started successfully.
    Nov 11, 2020 6:36:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T06:36:04.546Z: Workers have started successfully.
    Nov 11, 2020 6:49:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State value RUNNING

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOWithRunnerV2 STANDARD_OUT
    Load test results for test (ID): 4d232da1-529f-4425-a8db-5e6221f607e7 and timestamp: 2020-11-11T06:31:26.129000000Z:
                     Metric:                    Value:
                   read_time                     1.051
                  write_time                    14.386
                    run_time                    15.437
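
These three figures are internally consistent: run_time here is simply the sum of the write and read phases (14.386 + 1.051 = 15.437 seconds).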

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest
Finished generating test XML results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.061 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Stored cache entry for task ':sdks:java:io:kafka:integrationTest' with cache key 296b644f572dcc1235ec5631b6adc390
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':' Thread 7,5,main]) completed. Took 18 mins 21.418 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution **** for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111062352
Successfully started process 'command 'docker''
Error response from daemon: conflict: unable to remove repository reference "us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111062352" (must force) - container de3fd4d54a1a is using its referenced image efc86cf5a958
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution **** for ':' Thread 7,5,main]) completed. Took 0.21 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 261

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'docker'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
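
The docker error above is the standard "image still referenced" conflict: container de3fd4d54a1a was created from the tagged image, so a plain docker rmi refuses to untag it. Assuming the container is disposable, removing it first (docker rm -f de3fd4d54a1a) or forcing the untag (docker rmi -f us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111062352) would normally let the clean-up task succeed, as it does in run #1532 further below.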

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 18s
133 actionable tasks: 110 executed, 21 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cmc5qwygti2d2

Stopped 4 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1532

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1532/display/redirect>

Changes:


------------------------------------------
[...truncated 705.56 KB...]
  No history is available.
Cache <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/container/.gogradle/cache/PersistenceNotationToResolvedCache-0.10.bin> not found, skip.
Cache <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/container/.gogradle/cache/PersistenceResolvedToDependenciesCache-0.10.bin> not found, skip.
Resolving ./github.com/apache/beam/sdks/go@<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/go>
:sdks:java:container:resolveBuildDependencies (Thread[Execution **** for ':',5,main]) completed. Took 3.377 secs.
:sdks:java:container:installDependencies (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:container:installDependencies
Caching disabled for task ':sdks:java:container:installDependencies' because:
  Caching has not been enabled for the task
Task ':sdks:java:container:installDependencies' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Cache <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/container/.gogradle/cache/VendorSnapshot-0.10.bin> not found, skip.
:sdks:java:container:installDependencies (Thread[Execution **** for ':',5,main]) completed. Took 1.545 secs.
:sdks:java:container:buildLinuxAmd64 (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:container:buildLinuxAmd64
Caching disabled for task ':sdks:java:container:buildLinuxAmd64' because:
  Caching has not been enabled for the task
Task ':sdks:java:container:buildLinuxAmd64' is not up-to-date because:
  No history is available.
:sdks:java:container:buildLinuxAmd64 (Thread[Execution **** for ':',5,main]) completed. Took 5.361 secs.
:sdks:java:container:goBuild (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:container:goBuild
Caching disabled for task ':sdks:java:container:goBuild' because:
  Caching has not been enabled for the task
Task ':sdks:java:container:goBuild' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:sdks:java:container:goBuild (Thread[Execution **** for ':',5,main]) completed. Took 0.001 secs.
:sdks:java:container:dockerPrepare (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:container:dockerPrepare
Caching disabled for task ':sdks:java:container:dockerPrepare' because:
  Caching has not been enabled for the task
Task ':sdks:java:container:dockerPrepare' is not up-to-date because:
  No history is available.
:sdks:java:container:dockerPrepare (Thread[Execution **** for ':',5,main]) completed. Took 4.257 secs.
:sdks:java:container:docker (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:container:docker
Caching disabled for task ':sdks:java:container:docker' because:
  Caching has not been enabled for the task
Task ':sdks:java:container:docker' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/container/build/docker> Command: docker build --no-cache --build-arg pull_licenses=true --build-arg java_version=8 -t apache/beam_java8_sdk:2.27.0.dev .
Successfully started process 'command 'docker''
Sending build context to Docker daemon  381.2MB
Step 1/17 : ARG java_version
Step 2/17 : FROM openjdk:${java_version}
 ---> 3edb5f36304e
Step 3/17 : MAINTAINER "Apache Beam <de...@beam.apache.org>"
 ---> Running in 56327b76f399
Removing intermediate container 56327b76f399
 ---> 2e7e46307729
Step 4/17 : ARG pull_licenses
 ---> Running in dbffc294dcb6
Removing intermediate container dbffc294dcb6
 ---> 7d6c64685793
Step 5/17 : ADD target/slf4j-api.jar /opt/apache/beam/jars/
 ---> c7aa8ff237e0
Step 6/17 : ADD target/slf4j-jdk14.jar /opt/apache/beam/jars/
 ---> f34ac52e9513
Step 7/17 : ADD target/beam-sdks-java-harness.jar /opt/apache/beam/jars/
 ---> fe23f963baec
Step 8/17 : ADD target/beam-sdks-java-io-kafka.jar /opt/apache/beam/jars/
 ---> d05c1b06b3f4
Step 9/17 : ADD target/kafka-clients.jar /opt/apache/beam/jars/
 ---> aca7fe477fe8
Step 10/17 : ADD target/linux_amd64/boot /opt/apache/beam/
 ---> e33690f51d81
Step 11/17 : COPY target/LICENSE /opt/apache/beam/
 ---> 8cf669e1e579
Step 12/17 : COPY target/NOTICE /opt/apache/beam/
 ---> b2c3021815a8
Step 13/17 : ADD target/third_party_licenses /opt/apache/beam/third_party_licenses/
 ---> e9319a77c4f1
Step 14/17 : COPY target/LICENSE target/go-licenses/* /opt/apache/beam/third_party_licenses/golang/
 ---> 1330ce76fe45
Step 15/17 : RUN rm /opt/apache/beam/third_party_licenses/golang/LICENSE
 ---> Running in 1416160d0606
Removing intermediate container 1416160d0606
 ---> b20a7577cb1e
Step 16/17 : RUN if [ "${pull_licenses}" = "false" ] ; then     rm -rf /opt/apache/beam/third_party_licenses ;    fi
 ---> Running in 0ee5978540fb
Removing intermediate container 0ee5978540fb
 ---> db09ee101e3b
Step 17/17 : ENTRYPOINT ["/opt/apache/beam/boot"]
 ---> Running in 86f74d822458
Removing intermediate container 86f74d822458
 ---> fd578f41d188
Successfully built fd578f41d188
Successfully tagged apache/beam_java8_sdk:2.27.0.dev
:sdks:java:container:docker (Thread[Execution **** for ':',5,main]) completed. Took 24.064 secs.
:runners:google-cloud-dataflow-java:buildAndPushDockerContainer (Thread[Execution **** for ':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
Custom actions are attached to task ':runners:google-cloud-dataflow-java:buildAndPushDockerContainer'.
Caching disabled for task ':runners:google-cloud-dataflow-java:buildAndPushDockerContainer' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:buildAndPushDockerContainer' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker tag apache/beam_java8_sdk:2.27.0.dev us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145
Successfully started process 'command 'docker''
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud docker -- push us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command 'gcloud''
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
d74fbd158493: Preparing
afa3767ee9b4: Preparing
c4c611ac856d: Preparing
338fbf190254: Preparing
1724a94ca052: Preparing
0c60081281f2: Preparing
dea6f14d1a1a: Preparing
f9c4d2aa9230: Preparing
e3b0ea7e3eba: Preparing
cd9684590272: Preparing
fb0d9a67ce3c: Preparing
29c7de453b8e: Preparing
144903481aa9: Preparing
849ea2764450: Preparing
0c60081281f2: Waiting
f49d20b92dc8: Preparing
f9c4d2aa9230: Waiting
fe342cfe5c83: Preparing
630e4f1da707: Preparing
dea6f14d1a1a: Waiting
9780f6d83e45: Preparing
cd9684590272: Waiting
849ea2764450: Waiting
144903481aa9: Waiting
fb0d9a67ce3c: Waiting
9780f6d83e45: Waiting
630e4f1da707: Waiting
f49d20b92dc8: Waiting
afa3767ee9b4: Pushed
338fbf190254: Pushed
1724a94ca052: Pushed
d74fbd158493: Pushed
c4c611ac856d: Pushed
f9c4d2aa9230: Pushed
dea6f14d1a1a: Pushed
29c7de453b8e: Layer already exists
144903481aa9: Layer already exists
849ea2764450: Layer already exists
f49d20b92dc8: Layer already exists
fe342cfe5c83: Layer already exists
630e4f1da707: Layer already exists
9780f6d83e45: Layer already exists
cd9684590272: Pushed
fb0d9a67ce3c: Pushed
0c60081281f2: Pushed
e3b0ea7e3eba: Pushed
20201111054145: digest: sha256:e1b8712bdbd793cee3a3eb3f5c9de971995ebe34836bd3addd65caa6cf2d527f size: 4098
:runners:google-cloud-dataflow-java:buildAndPushDockerContainer (Thread[Execution **** for ':',5,main]) completed. Took 7.906 secs.
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':',5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:kafka:integrationTest
Custom actions are attached to task ':sdks:java:io:kafka:integrationTest'.
Build cache key for task ':sdks:java:io:kafka:integrationTest' is 3cf707aa6b0c338fa66891ece7576cc4
Task ':sdks:java:io:kafka:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"100000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=kafkaioit_results_sdf_wrapper","--influxMeasurement=kafkaioit_results_sdf_wrapper","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--kafkaBootstrapServerAddresses=34.69.183.210:32400,35.238.232.105:32401,35.202.128.13:32402","--kafkaTopic=beam-runnerv2","--readTimeout=1200","--numWorkers=5","--autoscalingAlgorithm=NONE","--experiments=beam_fn_api,use_runner_v2,use_unified_****","--****HarnessContainerImage=us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145","--region=us-central1"] -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.6.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
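
The -DbeamTestPipelineOptions property passed to the test JVM above is a JSON array of --name=value strings that the test harness turns into pipeline options. A minimal sketch of that decoding step, assuming Jackson on the classpath (the real Beam test plumbing differs in detail):

    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class TestOptionsLoader {
      public static PipelineOptions load() throws Exception {
        // e.g. ["--tempRoot=gs://...", "--runner=DataflowRunner", ...]
        String json = System.getProperty("beamTestPipelineOptions", "[]");
        String[] args = new ObjectMapper().readValue(json, String[].class);
        // Parses and validates the flags exactly like command-line arguments.
        return PipelineOptionsFactory.fromArgs(args).create();
      }
    }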

org.apache.beam.sdk.io.kafka.KafkaIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.27.0-SNAPSHOT.jar>!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > classMethod FAILED
    java.lang.UnsupportedOperationException: No hash for that record count: 100000
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.getHashForRecordCount(KafkaIOIT.java:296)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.setup(KafkaIOIT.java:115)
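
The setup failure above is a lookup of an expected correctness hash keyed by record count: the verification map has no entry for 100000 under this configuration, so setup aborts before any pipeline runs. A rough sketch of the pattern (illustrative only; the real map and its entries live in KafkaIOIT):

    import java.util.HashMap;
    import java.util.Map;

    public class RecordCountHashes {
      // Hypothetical entries; the actual expected hashes are defined in KafkaIOIT.
      private static final Map<Long, String> EXPECTED_HASHES = new HashMap<>();
      static {
        EXPECTED_HASHES.put(1000L, "<hash-for-1000-records>");
      }

      public static String getHashForRecordCount(long count) {
        String hash = EXPECTED_HASHES.get(count);
        if (hash == null) {
          // Matches the failure above: no hash registered for count 100000.
          throw new UnsupportedOperationException("No hash for that record count: " + count);
        }
        return hash;
      }
    }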

1 test completed, 1 failed
Finished generating test XML results (0.014 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':',5,main]) completed. Took 3.581 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution **** for ':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e1b8712bdbd793cee3a3eb3f5c9de971995ebe34836bd3addd65caa6cf2d527f
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145
Successfully started process 'command 'gcloud''
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e1b8712bdbd793cee3a3eb3f5c9de971995ebe34836bd3addd65caa6cf2d527f
  Associated tags:
 - 20201111054145
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111054145].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e1b8712bdbd793cee3a3eb3f5c9de971995ebe34836bd3addd65caa6cf2d527f].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution **** for ':',5,main]) completed. Took 3.637 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution **** for ':',5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution **** for ':',5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 31s
133 actionable tasks: 97 executed, 34 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5vslpzuws5nic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_PerformanceTests_Kafka_IO - Build # 1531 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PerformanceTests_Kafka_IO - Build # 1531 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1531/ to view the results.

Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1530

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1530/display/redirect>

Changes:


------------------------------------------
[...truncated 734.02 KB...]
    Nov 11, 2020 4:38:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-WOpGwq0y6fJafFk7Fedv-dd95-TnXsWruit7pgBwnbU.jar
    Nov 11, 2020 4:38:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.27.0-SNAPSHOT-tHRBufhQSMn547qp5VE4x6Abwe6L5KNMI24XDTrm5Yo.jar
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 26 files newly uploaded in 2 seconds
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/Impulse as step s1
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s4
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step s6
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <94547 bytes, hash 54a5d7d30a41afe0451a5bd76ee8419bf0cd29ffee3b737c9ea2d81492e90950> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-VKXX0wpBr-BFGlvXbuhBm_DNKf_uO3N8nqLYFJLpCVA.pb
    Nov 11, 2020 4:38:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 4:38:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_20_38_47-7377754150461482857?project=apache-beam-testing
    Nov 11, 2020 4:38:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_20_38_47-7377754150461482857
    Nov 11, 2020 4:38:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_20_38_47-7377754150461482857
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:01.679Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.623Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.668Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.735Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.769Z: Expanding GroupByKey operations into optimizable parts.
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.852Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.880Z: Fusing consumer Generate records/ParDo(OutputSingleSource) into Generate records/Impulse
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.915Z: Fusing consumer s3/PairWithRestriction into Generate records/ParDo(OutputSingleSource)
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.951Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:02.990Z: Fusing consumer Measure write time into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:03.015Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:03.052Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:03.536Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:03.636Z: Executing operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 4:39:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:03.710Z: Starting 5 workers in us-central1-f...
    Nov 11, 2020 4:39:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:34.803Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 4:39:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:34.834Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 4:39:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:45.158Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 4:39:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:39:45.184Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 4:40:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:40:06.556Z: Workers have started successfully.
    Nov 11, 2020 4:40:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:40:06.586Z: Workers have started successfully.
    Nov 11, 2020 4:40:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:40:52.509Z: Finished operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 4:40:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:40:52.671Z: Executing operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 4:40:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:40:57.398Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 4:44:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:44:21.947Z: Finished operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 4:44:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:44:22.225Z: Cleaning up.
    Nov 11, 2020 4:44:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:44:22.327Z: Stopping worker pool...
    Nov 11, 2020 4:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:06.031Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 11, 2020 4:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:06.092Z: Worker pool stopped.
    Nov 11, 2020 4:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_20_38_47-7377754150461482857 finished with status DONE.
    Nov 11, 2020 4:45:11 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Nov 11, 2020 4:45:12 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Nov 11, 2020 4:45:12 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
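
The 409 above comes from the runner attempting to create the default staging bucket, which already exists, after which it falls back as logged. It is harmless here, but the probe can be avoided by setting both locations explicitly. A minimal sketch, assuming a placeholder bucket name:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ExplicitLocations {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // "my-bucket" is a placeholder; with both locations set, the runner
        // should not need to probe or create the default regional bucket.
        options.setTempLocation("gs://my-bucket/temp");
        options.setStagingLocation("gs://my-bucket/staging");
        Pipeline pipeline = Pipeline.create(options);
      }
    }
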
    Nov 11, 2020 4:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 209 files. Enable logging at DEBUG level to see which files will be staged.
    Nov 11, 2020 4:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 11, 2020 4:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 209 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 209 files cached, 0 files newly uploaded in 2 seconds
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Impulse as step s1
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(OutputSingleSource) as step s2
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(UnboundedSourceAsSDFWrapper) as step s3
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(StripIds) as step s4
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s5
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s6
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s7
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98500 bytes, hash f64cdacf688844532e6c8e5c6923aaff0e8554ed0174ed687a4ed1f9f752024c> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-9kzaz2iIRFMubI5caSOq_w6FVO0BdO1oek7R-fdSAkw.pb
    Nov 11, 2020 4:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 4:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_20_45_16-7711305734114511973?project=apache-beam-testing
    Nov 11, 2020 4:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_20_45_16-7711305734114511973
    Nov 11, 2020 4:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_20_45_16-7711305734114511973
    Nov 11, 2020 4:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:33.297Z: Worker configuration: n1-standard-4 in us-central1-f.
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.165Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.201Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.266Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.313Z: Expanding SplittableProcessKeyed operations into optimizable parts.
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.353Z: Expanding GroupByKey operations into streaming Read/Write steps
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.378Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.459Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.493Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(OutputSingleSource) into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Impulse
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.532Z: Fusing consumer s3/PairWithRestriction into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(OutputSingleSource)
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.556Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.579Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(StripIds) into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.611Z: Fusing consumer Measure read time into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/ParDo(StripIds)
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.646Z: Fusing consumer Map records to strings/Map into Measure read time
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:34.680Z: Fusing consumer Counting element into Map records to strings/Map
    Nov 11, 2020 4:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:36.079Z: Starting 5 workers in us-central1-f...
    Nov 11, 2020 4:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:45:37.738Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 4:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:46:03.674Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 11, 2020 4:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:46:03.717Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 4:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:46:14.010Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 11, 2020 4:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:46:34.873Z: Workers have started successfully.
    Nov 11, 2020 4:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:46:34.908Z: Workers have started successfully.
    Nov 11, 2020 5:05:18 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State value RUNNING
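
That warning means waitUntilFinish hit its timeout while the streaming read job was still RUNNING, so no terminal state was observed. A minimal sketch of the pattern, with an assumed timeout and cancellation policy rather than the verbatim KafkaIOIT code:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class BoundedWait {
      static void runWithTimeout(Pipeline readPipeline) throws IOException {
        PipelineResult result = readPipeline.run();
        // waitUntilFinish(Duration) returns the terminal state, or null when
        // the timeout elapses first -- the case that logs the WARNING above.
        PipelineResult.State state =
            result.waitUntilFinish(Duration.standardMinutes(20));
        if (state == null || !state.isTerminal()) {
          // Cancel the unbounded Kafka read instead of leaking a running job.
          result.cancel();
        }
      }
    }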

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOWithRunnerV2 FAILED
    java.lang.AssertionError: expected:<758088> but was:<100000>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2(KafkaIOIT.java:149)

1 test completed, 1 failed
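
The assertion compares the configured record count against what the read pipeline actually observed, and the "Counting element" step above suggests that number comes from a metrics counter. A hedged sketch of that pattern; the namespace, metric name, and wiring are assumptions, not the verbatim test:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.DoFn;

    public class ElementCountSketch {
      // Increments a counter for every record the read pipeline emits.
      static class CountingFn extends DoFn<String, String> {
        private final Counter elements = Metrics.counter("kafkaioit", "elements");

        @ProcessElement
        public void processElement(ProcessContext c) {
          elements.inc();
          c.output(c.element());
        }
      }

      // Sums the counter's attempted value once the job finishes or times out.
      static long readElementCount(PipelineResult result) {
        long count = 0;
        for (MetricResult<Long> counter :
            result
                .metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named("kafkaioit", "elements"))
                        .build())
                .getCounters()) {
          count += counter.getAttempted();
        }
        return count;
      }
    }

An assertEquals between the expected write volume and this count then fails exactly as logged whenever the observed count differs from the configured one.
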
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 26 mins 42.561 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 9,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111043423
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111043423
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4e322a6e446b50f236658ba5ffcaaff0aef33eb003ec34bedccca27a949b4517
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111043423
Successfully started process 'command 'gcloud''
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4e322a6e446b50f236658ba5ffcaaff0aef33eb003ec34bedccca27a949b4517
  Associated tags:
 - 20201111043423
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111043423
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111043423].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4e322a6e446b50f236658ba5ffcaaff0aef33eb003ec34bedccca27a949b4517].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3.576 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 9,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 31m 11s
133 actionable tasks: 98 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/olcowrsa4eok2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1529

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1529/display/redirect>

Changes:


------------------------------------------
[...truncated 737.99 KB...]
    Nov 11, 2020 4:01:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_20_01_28-15132547155710935350
    Nov 11, 2020 4:01:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_20_01_28-15132547155710935350
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:43.909Z: Worker configuration: n1-standard-1 in us-central1-b.
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:44.654Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:44.826Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:44.890Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:44.937Z: Expanding GroupByKey operations into optimizable parts.
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.016Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.043Z: Fusing consumer Generate records/ParDo(OutputSingleSource) into Generate records/Impulse
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.067Z: Fusing consumer s3/PairWithRestriction into Generate records/ParDo(OutputSingleSource)
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.090Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.111Z: Fusing consumer Measure write time into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.131Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.164Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Nov 11, 2020 4:01:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.722Z: Executing operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 4:01:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:45.832Z: Starting 5 workers in us-central1-b...
    Nov 11, 2020 4:01:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:01:53.415Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 4:02:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:02:29.111Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 4:02:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:02:45.942Z: Workers have started successfully.
    Nov 11, 2020 4:02:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:02:45.975Z: Workers have started successfully.
    Nov 11, 2020 4:03:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:03:29.774Z: Finished operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 4:03:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:03:29.904Z: Executing operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 4:06:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:06:58.041Z: Finished operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 4:06:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:06:58.198Z: Cleaning up.
    Nov 11, 2020 4:06:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:06:58.274Z: Stopping worker pool...
    Nov 11, 2020 4:07:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:07:50.631Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 11, 2020 4:07:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:07:50.679Z: Worker pool stopped.
    Nov 11, 2020 4:07:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_20_01_28-15132547155710935350 finished with status DONE.
    Nov 11, 2020 4:07:56 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Nov 11, 2020 4:07:57 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Nov 11, 2020 4:07:57 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Nov 11, 2020 4:07:57 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 209 files. Enable logging at DEBUG level to see which files will be staged.
    Nov 11, 2020 4:07:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 11, 2020 4:07:58 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 209 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 209 files cached, 0 files newly uploaded in 2 seconds
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Create/Impulse as step s1
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Create/ParDo(DecodeAndEmit) as step s2
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Split as step s3
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key as step s4
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign as step s5
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey as step s6
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable as step s7
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map as step s8
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Read as step s9
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds as step s10
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s11
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s12
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s13
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <111313 bytes, hash da5792a9fe77b700986fce786a0561753fcaac7b60d09d20c3c160bbad4c4349> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-2leSqf53twCYb854agVhdT_KrHtg0J0gw8Fgu61MQ0k.pb
    Nov 11, 2020 4:08:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 4:08:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_20_08_01-2322589840664177372?project=apache-beam-testing
    Nov 11, 2020 4:08:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_20_08_01-2322589840664177372
    Nov 11, 2020 4:08:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_20_08_01-2322589840664177372
    Nov 11, 2020 4:08:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:16.448Z: Worker configuration: n1-standard-4 in us-central1-f.
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:17.830Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:17.870Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:17.934Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.013Z: Expanding SplittableProcessKeyed operations into optimizable parts.
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.101Z: Expanding GroupByKey operations into streaming Read/Write steps
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.164Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.277Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.315Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Create/ParDo(DecodeAndEmit) into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Create/Impulse
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.353Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Split into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Create/ParDo(DecodeAndEmit)
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.379Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Split
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.428Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.476Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/WriteStream into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.512Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/MergeBuckets into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/ReadStream
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.547Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/MergeBuckets
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.584Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.620Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Read into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.653Z: Fusing consumer Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/Read
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.689Z: Fusing consumer Measure read time into Read from Runner V2 Kafka/Read(KafkaUnboundedSource)/StripIds
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.726Z: Fusing consumer Map records to strings/Map into Measure read time
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:18.765Z: Fusing consumer Counting element into Map records to strings/Map
    Nov 11, 2020 4:08:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:20.082Z: Starting 5 workers in us-central1-f...
    Nov 11, 2020 4:08:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:45.803Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 4:08:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:47.662Z: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 11, 2020 4:08:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:47.698Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Nov 11, 2020 4:08:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:08:58.082Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 11, 2020 4:09:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:09:28.075Z: Workers have started successfully.
    Nov 11, 2020 4:09:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:09:28.104Z: Workers have started successfully.
    Nov 11, 2020 4:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:26:28.889Z: Cleaning up.
    Nov 11, 2020 4:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:26:29.181Z: Stopping worker pool...
    Nov 11, 2020 4:26:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:26:29.256Z: Stopping worker pool...
    Nov 11, 2020 4:27:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:27:21.014Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
    Nov 11, 2020 4:27:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T04:27:21.060Z: Worker pool stopped.
    Nov 11, 2020 4:27:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_20_08_01-2322589840664177372 finished with status DONE.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOWithRunnerV2 FAILED
    java.lang.AssertionError: expected:<100000000> but was:<100000>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2(KafkaIOIT.java:149)

1 test completed, 1 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 26 mins 10.167 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111035659
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111035659
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b64af2601e6a3da41457b3b7c272065392c3cf4e206c6ffcb73f18d1083858c2
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111035659
Successfully started process 'command 'gcloud''
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b64af2601e6a3da41457b3b7c272065392c3cf4e206c6ffcb73f18d1083858c2
  Associated tags:
 - 20201111035659
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111035659
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111035659].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b64af2601e6a3da41457b3b7c272065392c3cf4e206c6ffcb73f18d1083858c2].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3.646 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 30m 43s
133 actionable tasks: 98 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4qsirie2a62iy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1528

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1528/display/redirect>

Changes:


------------------------------------------
[...truncated 725.00 KB...]
    INFO: Uploading 209 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-kwh0vSfUKFn2AXoxoiJL9_Bizt-9aE1V3GjzPtWfMZc.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/main5388295209306683650.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/main-mS5lURO06B2bMP66zU7-Ws2sXhPY6SXvk9hLScUCoys.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-4ir4GSjcm7bdrE0uDpVe1NnjKC9GOdYOdk5CUunB2Lw.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-u7IZwd_HO_yfd3RCw5nMcQa483kH9_f52HdMcbs9ZWw.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/synthetic/build/libs/beam-sdks-java-io-synthetic-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-synthetic-2.27.0-SNAPSHOT-g1dUXt-qN0ZX57XjZuS9zrfJSEGLigJfxU0FZ1z3xQE.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-TXapV1YxJ5lec6D3tCbnxenfEEv25TqfG9fBN8NZT7A.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-HtbmNbDVPoM4rqK4vHTKmKxOiP7auqibepR9FxAC9FU.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-jeHrQlpOYRvStLm0XQKp0WU68gY-85VqPCJJKYHUYZI.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-V3SgKq7rSfViDAHLJW8c5YQSNTVCREou4_goaiFi_8s.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-Pzf_Gm_0_5-diUSK4b0UVcr5aaHYjMMOcFfuVcU7czY.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-1MQLHEeJK_75I279yNzkmQUc5SHg7SbqoV3fLzPjFpg.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-Pz1BeCwXWVzP7pqUqMvdfds6M0K-XjtapNWxc8l2gEw.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-LV7TyCizZ97MWVQ_W9LE-xCHkEmFCahz2GGWolEE-PI.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-O5SRtdkGX8E0AkXg7gDD2OXjEbl7WalrBzcRk5Maq80.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-3Jaowvgc_GSurgs93IJIxstDa1GGx_DHSSO4fo8PMHM.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-8OHsCnIeUJdMU20gpTaPC2QjkqKl2kVYEucLK5zq-Co.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-FTUTKY_705IXiH4why7vGE4hjDqYnjeuSEqM_ewbzjA.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-UpCnoFdV3AtBkyPJEcyGgRkB2cyYS7ic_ho3Q-yqSTU.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2399555446296456878.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-ns1D0f3S69bVfFzsBTdGgWr8OEH95SaSVMHeQuKQcAE.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-FLAKT0g7C4zp9rXceWP5u5DpPgJSdv74BJTgM7pskyM.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-L9XHX7MhrC61WLOddBQLpkpMi2Fep77yqdKEmAS3Qx8.jar
    Nov 11, 2020 2:48:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-fTTQ9a2xaYLvQ3huLviYz6eoS_JnaJMdtqKhEU3Ix4M.jar
    Nov 11, 2020 2:48:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-Twi1xC0ax4Um-WwR3sT5R0Renx8GDmTid7rX4SjEwTI.jar
    Nov 11, 2020 2:48:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.27.0-SNAPSHOT-EoTtSJLppSkKXx6Q6yfj4vUDE8ANU4GbwlrcLgebC7A.jar
    Nov 11, 2020 2:48:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-rh-bRVwjgrJs_I38TyMNcq2jhajDOJukYXkztF9yfRc.jar
    Nov 11, 2020 2:48:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.27.0-SNAPSHOT-c3N1h2x34jIhejdgEcejfflkfV4zmVZJSoht0G9hn1M.jar
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 26 files newly uploaded in 2 seconds
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/Impulse as step s1
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s4
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step s6
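
For reference, steps s1-s6 map onto a synthetic-records write pipeline of roughly the following shape. This is a sketch under assumptions, not the actual KafkaIOIT source: the record counts, the metrics namespace, and the broker/topic values are placeholders, while SyntheticBoundedSource, TimeMonitor, and KafkaIO.write() are the Beam APIs the step names above point at.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.Read;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.synthetic.SyntheticBoundedSource;
    import org.apache.beam.sdk.io.synthetic.SyntheticOptions;
    import org.apache.beam.sdk.io.synthetic.SyntheticSourceOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.testutils.metrics.TimeMonitor;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    public class KafkaWriteSketch {
      public static void main(String[] args) throws Exception {
        // Placeholder record shape; the real test takes these from pipeline options.
        SyntheticSourceOptions sourceOptions = SyntheticOptions.fromJsonString(
            "{\"numRecords\": 1000, \"keySizeBytes\": 10, \"valueSizeBytes\": 90}",
            SyntheticSourceOptions.class);

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p
            // s1-s3: with runner v2 the bounded source is expanded into Impulse,
            // ParDo(OutputSingleSource) and ParDo(BoundedSourceAsSDFWrapper).
            .apply("Generate records", Read.from(new SyntheticBoundedSource(sourceOptions)))
            // s4: records pass through a timing DoFn (namespace/name are assumed here).
            .apply("Measure write time",
                ParDo.of(new TimeMonitor<byte[], byte[]>("kafkaioit", "write_time")))
            // s5-s6: KafkaIO.write() maps KVs to ProducerRecords, then ParDo(KafkaWriter).
            .apply("Write to Kafka", KafkaIO.<byte[], byte[]>write()
                .withBootstrapServers("localhost:9092") // placeholder broker
                .withTopic("beam")                      // placeholder topic
                .withKeySerializer(ByteArraySerializer.class)
                .withValueSerializer(ByteArraySerializer.class));
        p.run().waitUntilFinish();
      }
    }
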
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <94545 bytes, hash 55f44c4d0e22e5329f27e8dd00ce5d797869fa86907e00a57b3d241e50b7a6a2> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-VfRMTQ4i5TKfJ-jdAM5deXhp-oaQfgClez0kHlC3pqI.pb
    Nov 11, 2020 2:48:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 11, 2020 2:48:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_18_48_30-12553582351912707351?project=apache-beam-testing
    Nov 11, 2020 2:48:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_18_48_30-12553582351912707351
    Nov 11, 2020 2:48:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_18_48_30-12553582351912707351
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:44.903Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.487Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.523Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.611Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.637Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.676Z: Expanding GroupByKey operations into optimizable parts.
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.739Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.772Z: Fusing consumer Generate records/ParDo(OutputSingleSource) into Generate records/Impulse
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.809Z: Fusing consumer s3/PairWithRestriction into Generate records/ParDo(OutputSingleSource)
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.832Z: Fusing consumer s3/SplitWithSizing into s3/PairWithRestriction
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.879Z: Fusing consumer Measure write time into s3/ProcessElementAndRestrictionWithSizing
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.919Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:45.946Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:46.323Z: Executing operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 2:48:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:48:46.393Z: Starting 5 workers in us-central1-f...
    Nov 11, 2020 2:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:49:17.247Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 2:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:49:17.291Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
    Nov 11, 2020 2:49:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:49:27.645Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 11, 2020 2:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:49:41.803Z: Workers have started successfully.
    Nov 11, 2020 2:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:49:41.833Z: Workers have started successfully.
    Nov 11, 2020 2:50:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:50:24.189Z: Finished operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)+s3/PairWithRestriction+s3/SplitWithSizing
    Nov 11, 2020 2:50:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:50:24.374Z: Executing operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 2:53:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:53:26.175Z: Finished operation s3/ProcessElementAndRestrictionWithSizing+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Nov 11, 2020 2:53:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:53:26.353Z: Cleaning up.
    Nov 11, 2020 2:53:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:53:26.425Z: Stopping worker pool...
    Nov 11, 2020 2:54:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:54:24.407Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 11, 2020 2:54:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-11T02:54:24.453Z: Worker pool stopped.
    Nov 11, 2020 2:54:31 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_18_48_30-12553582351912707351 finished with status DONE.
    Nov 11, 2020 2:54:31 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Nov 11, 2020 2:54:31 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Nov 11, 2020 2:54:31 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Nov 11, 2020 2:54:31 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 209 files. Enable logging at DEBUG level to see which files will be staged.
    Nov 11, 2020 2:54:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOWithRunnerV2 FAILED
    java.lang.IllegalArgumentException: unable to serialize DoFnWithExecutionInformation{doFn=org.apache.beam.sdk.io.kafka.KafkaIOIT$1@6385b059, mainOutputTag=Tag<org.apache.beam.sdk.values.PCollection.<init>:402#bf09ad1bfb1f38d6>, sideInputMapping={}, schemaInformation=DoFnSchemaInformation{elementConverters=[]}}
        at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
        at org.apache.beam.runners.core.construction.ParDoTranslation.translateDoFn(ParDoTranslation.java:692)
        at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$PayloadTranslator$1.translateDoFn(PrimitiveParDoSingleFactory.java:218)
        at org.apache.beam.runners.core.construction.ParDoTranslation.payloadForParDoLike(ParDoTranslation.java:814)
        at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$PayloadTranslator.payloadForParDoSingle(PrimitiveParDoSingleFactory.java:214)
        at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$PayloadTranslator.translate(PrimitiveParDoSingleFactory.java:163)
        at org.apache.beam.runners.core.construction.PTransformTranslation$KnownTransformPayloadTranslator.translate(PTransformTranslation.java:428)
        at org.apache.beam.runners.core.construction.PTransformTranslation.toProto(PTransformTranslation.java:238)
        at org.apache.beam.runners.core.construction.SdkComponents.registerPTransform(SdkComponents.java:175)
        at org.apache.beam.runners.core.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:87)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:587)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:579)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:239)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:213)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:468)
        at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:927)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:353)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:334)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOWithRunnerV2(KafkaIOIT.java:152)

        Caused by:
        java.io.NotSerializableException: org.apache.beam.sdk.io.kafka.KafkaIOIT
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
            at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
            at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
            ... 21 more
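
The root cause is in the Caused by frame: the DoFn being serialized is KafkaIOIT$1, an anonymous inner class, and an anonymous inner class always holds an implicit reference to its enclosing instance. KafkaIOIT itself does not implement Serializable, so translating the ParDo payload fails before the job is even submitted. Below is a minimal sketch of the failure mode and the conventional fix, using hypothetical class and method names rather than the actual test code:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    public class SomeIT { // not Serializable, like KafkaIOIT

      PCollection<String> broken(PCollection<String> input) {
        // Compiles to SomeIT$1; carries a hidden this$0 field pointing at the
        // enclosing SomeIT instance, so serializing the DoFn fails as above.
        return input.apply(ParDo.of(new DoFn<String, String>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            c.output(c.element());
          }
        }));
      }

      // Fix: a static nested (or top-level) DoFn has no outer-instance reference.
      static class PassThroughFn extends DoFn<String, String> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element());
        }
      }

      PCollection<String> fixed(PCollection<String> input) {
        return input.apply(ParDo.of(new PassThroughFn()));
      }
    }

Making the enclosing class Serializable would also silence the exception, but it then drags the whole test instance into every serialized DoFn; hoisting the DoFn into a static nested class is the idiomatic Beam remedy.
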

1 test completed, 1 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 6 mins 15.384 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111024359
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111024359
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7e5630c55d2e707d4bc5b98de414ae3ee6ee2a33b4d3eca1ba79714e0877a97
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111024359
Successfully started process 'command 'gcloud''
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7e5630c55d2e707d4bc5b98de414ae3ee6ee2a33b4d3eca1ba79714e0877a97
  Associated tags:
 - 20201111024359
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111024359
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20201111024359].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7e5630c55d2e707d4bc5b98de414ae3ee6ee2a33b4d3eca1ba79714e0877a97].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3.438 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
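
For example, assuming a checkout at the Jenkins workspace root shown above, the failing task could be re-run with the suggested flags as:

    ./gradlew :sdks:java:io:kafka:integrationTest --stacktrace --scan
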

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 50s
133 actionable tasks: 98 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sg3cosiyvfyqm

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
