Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/10/30 21:19:02 UTC

Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1456

See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1456/display/redirect>

Changes:


------------------------------------------
GitHub pull request #12572 of commit f78c1196249ce8dbee2d2a11b0a0fcecfada9ba3, no merge conflicts.
Running as SYSTEM
Setting status of f78c1196249ce8dbee2d2a11b0a0fcecfada9ba3 to PENDING with url https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1456/ and message: 'Build started for merge commit.'
Using context: Java KafkaIO Performance Test
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/12572/*:refs/remotes/origin/pr/12572/* # timeout=10
 > git rev-parse refs/remotes/origin/pr/12572/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/12572/merge^{commit} # timeout=10
Checking out Revision 025d9550dd77b986a26046f562414873aa1c20d2 (refs/remotes/origin/pr/12572/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 025d9550dd77b986a26046f562414873aa1c20d2 # timeout=10
Commit message: "Merge f78c1196249ce8dbee2d2a11b0a0fcecfada9ba3 into 168d442314a3bd012eedf2915d1aaef7f4092bdc"
First time build. Skipping changelog.
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins4200177534924115889.sh
+ cp /home/jenkins/.kube/config <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456>

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins5076559053246730520.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a
Fetching cluster endpoint and auth data.
kubeconfig entry generated for io-datastores.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins7775697830940802060.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> createNamespace beam-performancetests-kafka-io-1456
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456>
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456> --namespace=default'
+ createNamespace beam-performancetests-kafka-io-1456
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456> create namespace beam-performancetests-kafka-io-1456'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456> create namespace beam-performancetests-kafka-io-1456
namespace/beam-performancetests-kafka-io-1456 created
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-1456

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins2762886170319366178.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-1456
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456> --namespace=beam-performancetests-kafka-io-1456'
+ apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456> --namespace=beam-performancetests-kafka-io-1456 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster'>
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1456> --namespace=beam-performancetests-kafka-io-1456 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
storageclass.storage.k8s.io/kafka-broker unchanged
storageclass.storage.k8s.io/kafka-zookeeper unchanged
clusterrole.rbac.authorization.k8s.io/node-reader unchanged
clusterrolebinding.rbac.authorization.k8s.io/kafka-node-reader configured
role.rbac.authorization.k8s.io/pod-labler created
rolebinding.rbac.authorization.k8s.io/kafka-pod-labler created
configmap/zookeeper-config created
service/pzoo created
service/zookeeper created
statefulset.apps/pzoo created
configmap/broker-config created
service/broker created
service/bootstrap created
statefulset.apps/kafka created
configmap/kafka-config created
job.batch/kafka-config-eff079ec created
Error from server (Invalid): error when creating "<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster/04-outside-services/outside-0.yml>": Service "outside-0" is invalid: spec.ports[0].nodePort: Invalid value: 32400: provided port is already allocated
Error from server (Invalid): error when creating "<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster/04-outside-services/outside-1.yml>": Service "outside-1" is invalid: spec.ports[0].nodePort: Invalid value: 32401: provided port is already allocated
Error from server (Invalid): error when creating "<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster/04-outside-services/outside-2.yml>": Service "outside-2" is invalid: spec.ports[0].nodePort: Invalid value: 32402: provided port is already allocated
Build step 'Execute shell' marked build as failure
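
Note on the failure above: NodePorts 32400-32402 are fixed values in the outside-service manifests (they match the ports in the test's --kafkaBootstrapServerAddresses), so the conflict most likely comes from outside-* Services left behind in a namespace from an earlier build. A hypothetical way to locate and free them with kubectl; the commands below are illustrative, not part of the job:

    # List every Service that holds one of the three NodePorts, by namespace.
    kubectl get svc --all-namespaces \
      -o jsonpath='{range .items[*]}{.metadata.namespace}{"\t"}{.metadata.name}{"\t"}{.spec.ports[*].nodePort}{"\n"}{end}' \
      | grep -E '3240[0-2]'
    # Deleting the stale per-build namespace releases its NodePorts.
    kubectl delete namespace <stale-namespace>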

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Kafka_IO #1461

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1461/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1460

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1460/display/redirect>

Changes:


------------------------------------------
[...truncated 247.04 KB...]
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-B_BoWVbTsjdsWwqEzLL-p1WnsAoyaxXKNw3Gm-gj2B0.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-0YAGEUgAKoRSqUeirZ1YwoKgcltvuy_q_BiQDONCqjU.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-J5ViKZvp8hDx9nYFTUjS5e2cTiEgrDuHi7n640cvEm0.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qcGjyMFsPfMlX5nU3xnhPwIn0kC2zPRrvXaN53Ko_h8.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-OkC7FbN6RuSJU8ge0bIbk8GnfN1R26AfKOiNzu0fjFI.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-xoKDWU95WOw9_LWa0eeLgsncIHvog4wtSIsuNRP-A-U.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-iwMrYS-2_mj8S-AxvqkqMbIblr2MsoQy2WiJWFiAEGA.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/synthetic/build/libs/beam-sdks-java-io-synthetic-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-sdks-java-io-synthetic-2.26.0-SNAPSHOT-Xtqg0HcRZSgJnTWcYXzlHdHSVvZIfKFhq1YtDAitWwQ.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-job-management-2.26.0-SNAPSHOT-4R4H-wipu3YfN85p4JluENdnh-Cq4H53L9DprPTATd0.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-CjTdtTyEO8T6-GS1JQOXhKxi7XY9_NGyXcuBz_Cvh4c.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-pipeline-2.26.0-SNAPSHOT-WSGLwuoOHhOi5an_JYetVDGgthBvdSBEH_jpnG7NmCE.jar
    Oct 30, 2020 10:40:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://dataflow-staging-us-central1-844138762903/temp/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-gBUD68WfdkUvDrt2muRxGdG28aI7z_qMdUbGVG9YRwU.jar
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 177 files cached, 26 files newly uploaded in 1 seconds
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records as step s1
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s2
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s3
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step s4
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <88453 bytes, hash c53df382df65b3bfc19f64c728ef9bd7280149211dac2443905951a974c88989> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-xT3zgt9ls7_Bn2THKO-b1ygBSSEdrCRDkFlRqXTIiYk.pb
    Oct 30, 2020 10:40:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 30, 2020 10:40:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-30_15_40_05-5346441651832381641?project=apache-beam-testing
    Oct 30, 2020 10:40:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-30_15_40_05-5346441651832381641
    Oct 30, 2020 10:40:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-30_15_40_05-5346441651832381641
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:20.803Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.650Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.683Z: Expanding GroupByKey operations into optimizable parts.
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.715Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.799Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.835Z: Fusing consumer Measure write time into Generate records
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.860Z: Fusing consumer Write to Kafka/Kafka ProducerRecord/Map into Measure write time
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:21.892Z: Fusing consumer Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) into Write to Kafka/Kafka ProducerRecord/Map
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:22.366Z: Executing operation Generate records+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Oct 30, 2020 10:40:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:22.453Z: Starting 5 workers in us-central1-f...
    Oct 30, 2020 10:40:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:40:27.440Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 30, 2020 10:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:36.741Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 10:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:36.954Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 10:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:41.011Z: Workers have started successfully.
    Oct 30, 2020 10:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:42.275Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 10:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:42.307Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 10:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:47.616Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 10:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:47.649Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 10:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:45:49.006Z: Workers have started successfully.
    Oct 30, 2020 10:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:47:49.279Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 10:49:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:49:00.395Z: Finished operation Generate records+Measure write time+Write to Kafka/Kafka ProducerRecord/Map+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)
    Oct 30, 2020 10:49:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:49:00.593Z: Cleaning up.
    Oct 30, 2020 10:49:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:49:00.658Z: Stopping worker pool...
    Oct 30, 2020 10:54:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:54:10.266Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 30, 2020 10:54:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T22:54:10.306Z: Worker pool stopped.
    Oct 30, 2020 10:54:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-30_15_40_05-5346441651832381641 finished with status DONE.
    Oct 30, 2020 10:54:16 PM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Oct 30, 2020 10:54:17 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing. 
    Oct 30, 2020 10:54:17 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 30, 2020 10:54:17 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 202 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 30, 2020 10:54:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 30, 2020 10:54:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 30, 2020 10:54:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-CjTdtTyEO8T6-GS1JQOXhKxi7XY9_NGyXcuBz_Cvh4c.jar
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/Impulse as step s1
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/ParDo(GenerateKafkaSourceDescriptor) as step s2
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka) as step s3
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/Reshuffle.ViaRandomKey/Pair with random key as step s4
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign as step s5
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey as step s6
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s7
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/Reshuffle.ViaRandomKey/Values/Values/Map as step s8
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/KafkaCommitOffset/MapElements/Map as step s9
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/KafkaCommitOffset/Window.Into()/Window.Assign as step s10
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/KafkaCommitOffset/Combine.perKey(MaxLong)/GroupByKey as step s11
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/KafkaCommitOffset/Combine.perKey(MaxLong)/Combine.GroupedValues as step s12
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/KafkaCommitOffset/ParDo(CommitOffset) as step s13
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map as step s14
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s15
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s16
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step s17
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s18
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Impulse as step s19
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/ParDo(DecodeAndEmit) as step s20
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s21
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as step s22
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s23
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s24
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s25
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s26
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s27
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s28
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s29
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <148605 bytes, hash 3673e76d561c58d8da39b59245bcf26f81f446355faa11aa4572fdfa9e217755> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-NnPnbVYcWNjaObWSRbzyb4H0RjVfqhGqRXL9-p4hd1U.pb
    Oct 30, 2020 10:54:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 30, 2020 10:54:19 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs. 

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testSDFKafkaIORead FAILED
    java.lang.RuntimeException: Failed to create a workflow job: (fff20f41ab66c3db): The workflow could not be created. Causes: (fff20f41ab66c998): An internal service error occurred. Please contact customer support. Please include the message id if one is provided.
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1127)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:320)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:351)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:332)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.runKafkaTestPipeline(KafkaIOIT.java:139)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testSDFKafkaIORead(KafkaIOIT.java:160)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
        {
          "code" : 400,
          "errors" : [ {
            "domain" : "global",
            "message" : "(fff20f41ab66c3db): The workflow could not be created. Causes: (fff20f41ab66c998): An internal service error occurred. Please contact customer support. Please include the message id if one is provided.",
            "reason" : "badRequest"
          } ],
          "message" : "(fff20f41ab66c3db): The workflow could not be created. Causes: (fff20f41ab66c998): An internal service error occurred. Please contact customer support. Please include the message id if one is provided.",
          "status" : "INVALID_ARGUMENT"
        }
            at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:149)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:112)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:39)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:443)
            at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1108)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:541)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:474)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:591)
            at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:62)
            at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1113)
            ... 6 more

1 test completed, 1 failed
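
Note on the failure above: the 400/INVALID_ARGUMENT wraps what the message itself calls an internal Dataflow service error at job creation, i.e. a server-side failure rather than a problem in the pipeline graph, and such submissions often succeed on a later attempt (build #1461 going back to normal is consistent with that). A hypothetical retry guard a test harness could use, assuming the caller can rebuild and resubmit the pipeline on each attempt; the class and names below are illustrative, not Beam code:

    import java.util.concurrent.TimeUnit;
    import java.util.function.Supplier;
    import org.apache.beam.sdk.PipelineResult;

    class SubmitWithRetry {
      // Retries transient job-creation failures a few times with linear backoff.
      static PipelineResult submit(Supplier<PipelineResult> runPipeline) throws InterruptedException {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= 3; attempt++) {
          try {
            return runPipeline.get(); // e.g. () -> pipeline.run()
          } catch (RuntimeException e) {
            last = e; // e.g. "Failed to create a workflow job: ... internal service error"
            TimeUnit.SECONDS.sleep(30L * attempt); // back off before resubmitting
          }
        }
        throw last;
      }
    }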
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 14 mins 25.209 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 11s
85 actionable tasks: 52 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/rk6jbemdegnza

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1459

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1459/display/redirect>

Changes:


------------------------------------------
[...truncated 221.81 KB...]
> Task :sdks:java:io:kafka:compileTestJava
Custom actions are attached to task ':sdks:java:io:kafka:compileTestJava'.
Build cache key for task ':sdks:java:io:kafka:compileTestJava' is ce9a3954c90b5b8b9f370e5f5b7a023a
Task ':sdks:java:io:kafka:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':sdks:java:io:kafka:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileContent/java-modules.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileContent/annotation-processors.bin
Starting process 'Gradle Worker Daemon 1'. Working directory: /home/jenkins/.gradle/workers Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Xbootclasspath/p:/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.errorprone/javac/9+181-r4173-1/bdf4c0aa7d540ee1f7bf14d47447aea4bbf450c5/javac-9+181-r4173-1.jar -Xmx512m -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Worker Daemon 1'
Successfully started process 'Gradle Worker Daemon 1'

> Task :sdks:java:io:google-cloud-platform:jar
Caching disabled for task ':sdks:java:io:google-cloud-platform:jar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/resources/main>', not found
:sdks:java:io:google-cloud-platform:jar (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 1.089 secs.
:runners:google-cloud-dataflow-java:compileJava (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :runners:direct-java:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/direct-java/build/resources/main>'.
Custom actions are attached to task ':runners:direct-java:shadowJar'.
Caching disabled for task ':runners:direct-java:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/direct-java/build/resources/main>', not found

> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileJava' is 163e0d473928aa3efd832c8eb8126266
Task ':runners:google-cloud-dataflow-java:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileJava' with cache key 163e0d473928aa3efd832c8eb8126266
:runners:google-cloud-dataflow-java:compileJava (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.953 secs.
:runners:google-cloud-dataflow-java:classes (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:classes
Skipping task ':runners:google-cloud-dataflow-java:classes' as it has no actions.
:runners:google-cloud-dataflow-java:classes (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:jar (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:jar
Caching disabled for task ':runners:google-cloud-dataflow-java:jar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:jar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:jar (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.419 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :sdks:java:io:kafka:compileTestJava
Started Gradle worker daemon (3.069 secs) with fork options DaemonForkOptions{executable=/usr/lib/jvm/java-8-openjdk-amd64/bin/java, minHeapSize=null, maxHeapSize=null, jvmArgs=[-Xbootclasspath/p:/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.errorprone/javac/9+181-r4173-1/bdf4c0aa7d540ee1f7bf14d47447aea4bbf450c5/javac-9+181-r4173-1.jar], keepAliveMode=SESSION}.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is 871845356925bc3579a3025bbc73f2d0
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' with cache key 871845356925bc3579a3025bbc73f2d0
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 1.367 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 4,5,main]) started.
This JVM does not support getting OS memory, so no OS memory status updates will be broadcast

> Task :runners:direct-java:shadowJar
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 3.935s [3935ms]
Average Time/Jar: 0.6558333333333s [655.8333333333ms]
*******************
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 5.701 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:io:kafka:compileTestJava
Compiling with JDK Java compiler API.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is dd99e219afe04ca113fa9d38d238fd81
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key dd99e219afe04ca113fa9d38d238fd81
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 1.73 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/resources/main>'.
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/original_sources_to_package>'.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 1.262 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/resources/main>', not found
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/original_sources_to_package>', not found

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/resourceHashesCache.bin
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 7d1e393fd53afd27e82a6538cef72458
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 7d1e393fd53afd27e82a6538cef72458
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.886 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>'.
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 0.105 secs.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 10.508s [10508ms]
Average Time/Jar: 0.65675s [656.75ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 15.401 secs.

> Task :sdks:java:io:kafka:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 4.949 secs. 910 duplicate classes found in classpath (see all with --debug).
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Stored cache entry for task ':sdks:java:io:kafka:compileTestJava' with cache key ce9a3954c90b5b8b9f370e5f5b7a023a
:sdks:java:io:kafka:compileTestJava (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 27.735 secs.
:sdks:java:io:kafka:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:io:kafka:testClasses
Skipping task ':sdks:java:io:kafka:testClasses' as it has no actions.
:sdks:java:io:kafka:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:io:kafka:integrationTest
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/resourceHashesCache.bin
Custom actions are attached to task ':sdks:java:io:kafka:integrationTest'.
Build cache key for task ':sdks:java:io:kafka:integrationTest' is b7280a14e2820fc018d51b52da0fc284
Task ':sdks:java:io:kafka:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"100000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=kafkaioit_results","--influxMeasurement=kafkaioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--kafkaBootstrapServerAddresses=35.225.34.64:32400,34.71.138.60:32401,34.67.34.122:32402","--kafkaTopic=beam","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

Gradle Test Executor 2 started executing tests.

> Task :sdks:java:io:kafka:integrationTest

org.apache.beam.sdk.io.kafka.KafkaIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest

org.apache.beam.sdk.io.kafka.KafkaIOIT > testSDFKafkaIORead FAILED
    java.lang.IllegalStateException: GroupByKey cannot be applied to non-bounded PCollection in the GlobalWindow without a trigger. Use a Window.into or Window.triggering transform prior to GroupByKey.
        at org.apache.beam.sdk.transforms.GroupByKey.applicableTo(GroupByKey.java:156)
        at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:228)
        at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:545)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:479)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
        at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1622)
        at org.apache.beam.sdk.transforms.Combine$PerKey.expand(Combine.java:1511)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:545)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:479)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:355)
        at org.apache.beam.sdk.transforms.Combine$Globally.expand(Combine.java:1187)
        at org.apache.beam.sdk.transforms.Combine$Globally.expand(Combine.java:1071)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:545)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.runKafkaTestPipeline(KafkaIOIT.java:132)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testSDFKafkaIORead(KafkaIOIT.java:160)

1 test completed, 1 failed
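
Note on the failure above: the IllegalStateException names the fix directly. The SDF Kafka read is unbounded, so any GroupByKey-based aggregate (here reached through Combine.Globally at KafkaIOIT.java:132) needs a Window.into or a trigger first. A minimal sketch of that pattern, with hypothetical names rather than the actual KafkaIOIT code:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    class WindowedCountSketch {
      // Window the unbounded read before aggregating, so GroupByKey can fire per window.
      static PCollection<Long> countRecords(PCollection<String> records) {
        return records
            .apply("WindowBeforeCombine",
                Window.into(FixedWindows.of(Duration.standardMinutes(1))))
            // withoutDefaults() is required for Combine.globally on non-global windows.
            .apply(Combine.globally(Count.<String>combineFn()).withoutDefaults());
      }
    }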
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.089 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>

> Task :sdks:java:io:kafka:integrationTest FAILED
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 13.044 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
85 actionable tasks: 52 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/z542sca3bin2k

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1458

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1458/display/redirect>

Changes:


------------------------------------------
[...truncated 224.33 KB...]

> Task :sdks:java:io:kafka:testClasses
Skipping task ':sdks:java:io:kafka:testClasses' as it has no actions.
:sdks:java:io:kafka:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.119 secs. 1 duplicate classes found in classpath (see all with --debug).
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Stored cache entry for task ':sdks:java:io:google-cloud-platform:compileJava' with cache key 3dc18f63980fd28acc0367456bee2cbc
:sdks:java:io:google-cloud-platform:compileJava (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 1 mins 18.161 secs.
:sdks:java:io:google-cloud-platform:classes (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:classes
Skipping task ':sdks:java:io:google-cloud-platform:classes' as it has no actions.
:sdks:java:io:google-cloud-platform:classes (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:jar (Thread[Execution worker for ':' Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:jar
Caching disabled for task ':sdks:java:io:google-cloud-platform:jar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:jar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/google-cloud-platform/build/resources/main>', not found
:sdks:java:io:google-cloud-platform:jar (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.869 secs.
:runners:google-cloud-dataflow-java:compileJava (Thread[Execution worker for ':',5,main]) started.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileJava
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/resourceHashesCache.bin
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileJava' is 163e0d473928aa3efd832c8eb8126266
Task ':runners:google-cloud-dataflow-java:compileJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.449 secs. 1 duplicate classes found in classpath (see all with --debug).
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Stored cache entry for task ':runners:google-cloud-dataflow-java:compileJava' with cache key 163e0d473928aa3efd832c8eb8126266
:runners:google-cloud-dataflow-java:compileJava (Thread[Execution worker for ':',5,main]) completed. Took 24.734 secs.
:runners:google-cloud-dataflow-java:classes (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:classes
Skipping task ':runners:google-cloud-dataflow-java:classes' as it has no actions.
:runners:google-cloud-dataflow-java:classes (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:jar (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is dd99e219afe04ca113fa9d38d238fd81
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with JDK Java compiler API.

> Task :runners:google-cloud-dataflow-java:jar
Caching disabled for task ':runners:google-cloud-dataflow-java:jar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:jar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:jar (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.172 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/resourceHashesCache.bin
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is 871845356925bc3579a3025bbc73f2d0
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Compiling with JDK Java compiler API.

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.734 secs. 910 duplicate classes found in classpath (see all with --debug).
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Stored cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key dd99e219afe04ca113fa9d38d238fd81
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 38.106 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':',5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':',5,main]) completed. Took 0.652 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/resourceHashesCache.bin
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 7d1e393fd53afd27e82a6538cef72458
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.267 secs. 910 duplicate classes found in classpath (see all with --debug).
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Stored cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 7d1e393fd53afd27e82a6538cef72458
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':',5,main]) completed. Took 11.46 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>'.
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':',5,main]) completed. Took 0.057 secs.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 0.272 secs. 316 duplicate classes found in classpath (see all with --debug).
Stored cache entry for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' with cache key 871845356925bc3579a3025bbc73f2d0
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 32.531 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/resources/main>'.
Could not read file path '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/original_sources_to_package>'.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/resources/main>', not found
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/original_sources_to_package>', not found
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 6.413s [6413ms]
Average Time/Jar: 0.4008125s [400.8125ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 8.004 secs.
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) started.

> Task :sdks:java:io:kafka:integrationTest
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/fileHashes.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/6.6.1/fileHashes/resourceHashesCache.bin
Custom actions are attached to task ':sdks:java:io:kafka:integrationTest'.
Build cache key for task ':sdks:java:io:kafka:integrationTest' is 53c6fdaa56eaba0353b0481b40d7f816
Task ':sdks:java:io:kafka:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"100000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=kafkaioit_results","--influxMeasurement=kafkaioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--kafkaBootstrapServerAddresses=34.71.24.18:32400,34.123.255.10:32401,35.226.3.98:32402","--kafkaTopic=beam","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'
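
The -DbeamTestPipelineOptions property in the command above is how the Gradle integrationTest task hands pipeline options to the test JVM: a JSON array of --flag=value strings. A minimal sketch of consuming it, with a hypothetical KafkaTestOptions interface whose two properties mirror flags from the command line (the real KafkaIOIT options class may differ):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.testing.TestPipeline;

    class ReadTestOptionsSketch {
      // Hypothetical options interface, named here for illustration only.
      public interface KafkaTestOptions extends PipelineOptions {
        String getKafkaBootstrapServerAddresses();
        void setKafkaBootstrapServerAddresses(String value);

        String getKafkaTopic();
        void setKafkaTopic(String value);
      }

      static KafkaTestOptions readOptions() {
        // Registering the interface lets the factory validate its properties.
        PipelineOptionsFactory.register(KafkaTestOptions.class);
        // testingPipelineOptions() parses the beamTestPipelineOptions system
        // property (the JSON array above) into a PipelineOptions instance.
        return TestPipeline.testingPipelineOptions().as(KafkaTestOptions.class);
      }
    }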

Gradle Test Executor 3 started executing tests.

> Task :sdks:java:io:kafka:integrationTest

org.apache.beam.sdk.io.kafka.KafkaIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest

org.apache.beam.sdk.io.kafka.KafkaIOIT > testSDFKafkaIORead FAILED
    java.lang.IllegalArgumentException: commitOffsetsInFinalize() is enabled, but group.id in Kafka consumer config is not set. Offset management requires group.id.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:141)
        at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:1030)
        at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:478)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:545)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:56)
        at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:189)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.runKafkaTestPipeline(KafkaIOIT.java:127)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testSDFKafkaIORead(KafkaIOIT.java:158)
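
This precondition is unambiguous: committing offsets on checkpoint finalization only makes sense against a consumer group, because Kafka brokers store committed offsets per group.id. A minimal sketch of a read that passes the check; the bootstrap address, topic, and group id below are placeholders, not the test's real configuration:

    import java.util.Collections;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    class CommitOffsetsSketch {
      static KafkaIO.Read<byte[], byte[]> read() {
        return KafkaIO.<byte[], byte[]>read()
            .withBootstrapServers("localhost:9092") // placeholder address
            .withTopic("beam")
            .withKeyDeserializer(ByteArrayDeserializer.class)
            .withValueDeserializer(ByteArrayDeserializer.class)
            // The missing piece in the failing run: group.id must be present
            // in the consumer config before commitOffsetsInFinalize().
            .withConsumerConfigUpdates(
                Collections.<String, Object>singletonMap(
                    ConsumerConfig.GROUP_ID_CONFIG, "kafkaioit-group"))
            .commitOffsetsInFinalize();
      }
    }

withConsumerConfigUpdates() merges the given entries into KafkaIO's default consumer properties, so only the keys being overridden need to be supplied.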

1 test completed, 1 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.041 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>

> Task :sdks:java:io:kafka:integrationTest FAILED
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 8.922 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 35s
85 actionable tasks: 58 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/36ducq5ctojiw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #1457

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1457/display/redirect>

Changes:


------------------------------------------
GitHub pull request #13176 of commit 99348e33c7fedfff1ae76f017d0e4cd2df95efd3, no merge conflicts.
Running as SYSTEM
Setting status of 99348e33c7fedfff1ae76f017d0e4cd2df95efd3 to PENDING with url https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/1457/ and message: 'Build started for merge commit.'
Using context: Java KafkaIO Performance Test
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/13176/*:refs/remotes/origin/pr/13176/* # timeout=10
 > git rev-parse refs/remotes/origin/pr/13176/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/13176/merge^{commit} # timeout=10
Checking out Revision 6632cddb8277ba4b36cabc91d87b8260d2c35ab6 (refs/remotes/origin/pr/13176/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6632cddb8277ba4b36cabc91d87b8260d2c35ab6 # timeout=10
Commit message: "Merge 99348e33c7fedfff1ae76f017d0e4cd2df95efd3 into 168d442314a3bd012eedf2915d1aaef7f4092bdc"
First time build. Skipping changelog.
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins1527021704595916680.sh
+ cp /home/jenkins/.kube/config <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457>

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins6080530054434994486.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a
Fetching cluster endpoint and auth data.
kubeconfig entry generated for io-datastores.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins722615505149601376.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> createNamespace beam-performancetests-kafka-io-1457
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457>
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457> --namespace=default'
+ createNamespace beam-performancetests-kafka-io-1457
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457> create namespace beam-performancetests-kafka-io-1457'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457> create namespace beam-performancetests-kafka-io-1457
namespace/beam-performancetests-kafka-io-1457 created
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-1457

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Kafka_IO] $ /bin/bash -xe /tmp/jenkins1168598673248836346.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457>
+ KUBERNETES_NAMESPACE=beam-performancetests-kafka-io-1457
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457> --namespace=beam-performancetests-kafka-io-1457'
+ apply <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457> --namespace=beam-performancetests-kafka-io-1457 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/config-beam-performancetests-kafka-io-1457> --namespace=beam-performancetests-kafka-io-1457 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/.test-infra/kubernetes/kafka-cluster>
storageclass.storage.k8s.io/kafka-broker unchanged
storageclass.storage.k8s.io/kafka-zookeeper unchanged
clusterrole.rbac.authorization.k8s.io/node-reader unchanged
clusterrolebinding.rbac.authorization.k8s.io/kafka-node-reader configured
role.rbac.authorization.k8s.io/pod-labler created
rolebinding.rbac.authorization.k8s.io/kafka-pod-labler created
configmap/zookeeper-config created
service/pzoo created
service/zookeeper created
statefulset.apps/pzoo created
service/outside-1 created
service/outside-2 created
configmap/broker-config created
service/broker created
service/bootstrap created
statefulset.apps/kafka created
configmap/kafka-config created
job.batch/kafka-config-eff079ec created
The Service "outside-0" is invalid: spec.ports[0].nodePort: Invalid value: 32400: provided port is already allocated
(Kubernetes allocates NodePort values cluster-wide rather than per namespace, so the fixed nodePort 32400 collides with an earlier run's Kafka cluster that still holds it; the same port appears as a bootstrap address, 34.71.24.18:32400, in the #1458 log above.)
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org