Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/01/27 14:33:54 UTC

Build failed in Jenkins: beam_PerformanceTests_SparkReceiver_IO #213

See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/213/display/redirect?page=changes>

Changes:

[relax] update GCP cloud libraries BOM to 26.5.0


------------------------------------------
[...truncated 352.62 KB...]
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/kafka/build/resources/main>', not found
:sdks:java:io:kafka:jar (Thread[Execution worker Thread 7,5,main]) completed. Took 0.039 secs.
work action resolve beam-sdks-java-io-kafka.jar (project :sdks:java:io:kafka) (Thread[Execution worker Thread 3,5,main]) started.
work action null (Thread[Execution worker Thread 3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileJava (Thread[Execution worker Thread 2,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileJava (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileJava (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileJava
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileJava' is d8a4d1db996d8d1059a7e0d3dbc3ba58
Task ':runners:google-cloud-dataflow-java:compileJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
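(The two javac notes above are standard: they appear whenever any input file uses a deprecated API or raw generic types. As a hedged illustration — this class is not from the Beam codebase — the following is the kind of raw-type usage that produces the "unchecked or unsafe operations" note, with details visible only when compiled with -Xlint:unchecked.)

```java
// Illustrative only: raw-type usage that makes javac print
// "Note: Some input files use unchecked or unsafe operations."
import java.util.ArrayList;
import java.util.List;

public class UncheckedDemo {
    public static void main(String[] args) {
        List raw = new ArrayList();       // raw type: no type argument
        raw.add("hello");                 // unchecked call to add(E)
        List<String> typed = raw;         // unchecked conversion to List<String>
        System.out.println(typed.get(0)); // works at runtime; safety unverified at compile time
    }
}
```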
Class dependency analysis for incremental compilation took 0.073 secs.
Created classpath snapshot for incremental compilation in 0.032 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:compileJava' with cache key d8a4d1db996d8d1059a7e0d3dbc3ba58
:runners:google-cloud-dataflow-java:compileJava (Thread[included builds,5,main]) completed. Took 32.995 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:classes (Thread[Execution worker Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:classes (Thread[Execution worker Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:classes (Thread[Execution worker Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:classes
Skipping task ':runners:google-cloud-dataflow-java:classes' as it has no actions.
:runners:google-cloud-dataflow-java:classes (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:jar (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:jar (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:jar (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:jar
Caching disabled for task ':runners:google-cloud-dataflow-java:jar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:jar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:jar (Thread[included builds,5,main]) completed. Took 0.06 secs.
work action resolve beam-runners-google-cloud-dataflow-java.jar (project :runners:google-cloud-dataflow-java) (Thread[Execution worker Thread 3,5,main]) started.
work action null (Thread[Execution worker Thread 3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 2,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:worker:compileJava (Thread[Execution worker Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:worker:compileJava (Thread[Execution worker Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 2,5,main]) started.
:runners:google-cloud-dataflow-java:worker:compileJava (Thread[Execution worker Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 78776a7e76e06b7d7d851c5791ffba39
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Class dependency analysis for incremental compilation took 0.062 secs.
Created classpath snapshot for incremental compilation in 0.275 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 78776a7e76e06b7d7d851c5791ffba39
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 2,5,main]) completed. Took 10.812 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker Thread 4,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker,5,main]) completed. Took 0.032 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project :runners:google-cloud-dataflow-java) (Thread[Execution worker Thread 6,5,main]) started.
work action null (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.

> Task :runners:google-cloud-dataflow-java:worker:compileJava
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:worker:compileJava' is 82139db6e6ececd9877d0cc70f04aab6
Task ':runners:google-cloud-dataflow-java:worker:compileJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:worker:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Class dependency analysis for incremental compilation took 0.2 secs.
Created classpath snapshot for incremental compilation in 0.031 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:worker:compileJava' with cache key 82139db6e6ececd9877d0cc70f04aab6
:runners:google-cloud-dataflow-java:worker:compileJava (Thread[Execution worker Thread 3,5,main]) completed. Took 1 mins 11.256 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:worker:classes (Thread[Execution worker Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:worker:classes (Thread[Execution worker Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:classes (Thread[Execution worker Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:classes
Skipping task ':runners:google-cloud-dataflow-java:worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:classes (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:worker:shadowJar (Thread[Execution worker Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:worker:shadowJar (Thread[Execution worker Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:shadowJar (Thread[Execution worker Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:shadowJar
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:worker:shadowJar' is 9baf769eb7878186e020b882d5286565
Task ':runners:google-cloud-dataflow-java:worker:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/worker/build/original_sources_to_package>', not found
*******************
GRADLE SHADOW STATS

Total Jars: 15 (includes project)
Total Time: 2.787s [2787ms]
Average Time/Jar: 0.18580000000000002s [185.8ms]
*******************
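(The "Average Time/Jar" figure above is just Total Time divided by Total Jars, and its odd trailing decimal is ordinary double rounding. A minimal sketch of that arithmetic — the class name is mine, not part of the shadow plugin:)

```java
// Reproduces the shadow-stats arithmetic above: 2787 ms over 15 jars.
// The printed value may carry a long decimal tail because 0.1858 is not
// exactly representable as a binary double.
public class ShadowStatsCheck {
    public static void main(String[] args) {
        long totalMs = 2787;
        int jars = 15;
        double avgSeconds = (totalMs / (double) jars) / 1000.0; // ~0.1858 s per jar
        System.out.println(avgSeconds);
    }
}
```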
Stored cache entry for task ':runners:google-cloud-dataflow-java:worker:shadowJar' with cache key 9baf769eb7878186e020b882d5286565
:runners:google-cloud-dataflow-java:worker:shadowJar (Thread[Execution worker Thread 4,5,main]) completed. Took 4.094 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-worker.jar (project :runners:google-cloud-dataflow-java:worker) (Thread[Execution worker Thread 6,5,main]) started.
work action null (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-worker.jar (project :runners:google-cloud-dataflow-java:worker) (Thread[Execution worker Thread 4,5,main]) started.
work action null (Thread[Execution worker Thread 4,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution worker,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution worker,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution worker Thread 6,5,main]) started.
producer locations for task group 0 (Thread[Execution worker Thread 3,5,main]) started.
producer locations for task group 0 (Thread[Execution worker Thread 3,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
Downloading https://repo.maven.apache.org/maven2/com/google/cloud/google-cloud-bigquerystorage/2.28.3/google-cloud-bigquerystorage-2.28.3.pom to /home/jenkins/.gradle/.tmp/gradle_download7147193826629539499bin
Downloading https://repo.maven.apache.org/maven2/com/google/api/grpc/proto-google-cloud-bigquerystorage-v1/2.28.3/proto-google-cloud-bigquerystorage-v1-2.28.3.pom to /home/jenkins/.gradle/.tmp/gradle_download5393041708574207593bin
Custom actions are attached to task ':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is 1c57faafb1de632d2c253a926e287e65
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 7'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:guest@35.224.209.183:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.46.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.internal.worker.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 7'
Successfully started process 'Gradle Test Executor 7'

Gradle Test Executor 7 started executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.46.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test worker] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - 5000000 records were successfully written to RabbitMQ
    [Test worker] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO - ReadFromSparkReceiverWithOffsetDoFn started reading
    [Test worker] WARN org.apache.beam.runners.dataflow.DataflowRunner - Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    [Test worker] INFO org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory - No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    [Test worker] WARN org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
    [Test worker] INFO org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory - No stagingLocation provided, falling back to gcpTempLocation
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 426 files. Enable logging at DEBUG level to see which files will be staged.
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    [Test worker] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading 427 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    [pool-8-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.46.0-SNAPSHOT-wq0zWvhONI8axxgtDkOzsTBUT57aW5LBBFFf8RIaXA8.jar
    [pool-8-thread-10] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/test3292946857183401143.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-qSw-KzHHo3K9wu7HGZxqKi4IVnLEr2IGLm57WR7U0fs.jar
    [pool-8-thread-16] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/test7755996237971932830.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-YZIOGMS3FLvJI1ItHr3bw81pXJoVuQdhnx9ITNITUJA.jar
    [pool-8-thread-19] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/main8282223938166913876.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/main-N9J-6uIGGVoHbnxa-ygeu1IbIwKSvYFJltJ24cdYAyA.jar
    [Test worker] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Staging files complete: 424 files cached, 3 files newly uploaded in 2 seconds
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    [pool-15-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading <160406 bytes, hash bb0e5bc9b9edd6eb09be8474025565b1ab145a1fffeef71709119d4405f4ba73> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-uw5bybnt1usJvoR0AlVlsasUWh__7vcXCRGdRAX0unM.pb
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse as step s1
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair with initial restriction as step s2
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split restriction as step s3
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode windows as step s4
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign unique key/AddKeys/Map as step s5
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/ProcessKeyedElements as step s6
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Measure read time as step s7
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Counting element as step s8
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - Dataflow SDK version: 2.46.0-SNAPSHOT
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-01-27_06_31_50-15140714719367099695?project=apache-beam-testing
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - Submitted job: 2023-01-27_06_31_50-15140714719367099695
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2023-01-27_06_31_50-15140714719367099695
    [Test worker] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-01-27T14:31:59.035Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: sparkreceiverioit0testsparkreceiverioreadsinstreamingwitho-ei28. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-01-27T14:32:25.303Z: Worker configuration: e2-standard-4 in us-central1-b.
    [Test worker] ERROR org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-01-27T14:32:31.636Z: Workflow failed. Causes: The quota check has failed., Requested quota metric SSD Total GB is currently unavailable.
    [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-01-27T14:32:36.654Z: Cleaning up.
    [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-01-27T14:32:37.206Z: Worker pool stopped.
    [Test worker] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-01-27T14:32:38.999Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineJob - Job 2023-01-27_06_31_50-15140714719367099695 failed with status FAILED.
    [Test worker] ERROR org.apache.beam.sdk.testutils.metrics.MetricsReader - Failed to get metric spark_read_element_count, from namespace org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError: expected:<5000000> but was:<-1>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:337)
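(The `expected:<5000000> but was:<-1>` failure follows directly from the MetricsReader error above: the Dataflow job failed on the SSD quota check before any records were read, so the `spark_read_element_count` metric was never published and the lookup fell back to a -1 sentinel. A minimal stand-in sketch of that failure mode — `readMetric` and `EXPECTED_RECORDS` are illustrative names, not Beam's actual API:)

```java
// Hypothetical sketch: a metric lookup that returns -1 when the metric is
// absent, compared against the expected record count, reproduces the
// shape of the assertion failure above.
import java.util.HashMap;
import java.util.Map;

public class MetricFallbackSketch {
    static final long EXPECTED_RECORDS = 5_000_000L;
    static final long UNKNOWN = -1L; // sentinel for "metric not found"

    // Stand-in for a MetricsReader-style lookup: missing metrics map to -1.
    static long readMetric(Map<String, Long> metrics, String name) {
        return metrics.getOrDefault(name, UNKNOWN);
    }

    public static void main(String[] args) {
        Map<String, Long> metrics = new HashMap<>(); // failed job published nothing
        long actual = readMetric(metrics, "spark_read_element_count");
        if (actual != EXPECTED_RECORDS) {
            // Mirrors JUnit's message format for assertEquals failures.
            System.out.println("expected:<" + EXPECTED_RECORDS + "> but was:<" + actual + ">");
        }
    }
}
```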

Gradle Test Executor 7 finished executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest

1 test completed, 1 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>

> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution worker Thread 6,5,main]) completed. Took 9 mins 46.581 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 40s
139 actionable tasks: 96 executed, 41 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/snzygp5i2th62

Stopped 6 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_SparkReceiver_IO #214

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/214/display/redirect?page=changes>

