Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/10/20 08:58:33 UTC

Build failed in Jenkins: beam_PerformanceTests_SparkReceiver_IO #7

See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/7/display/redirect>

Changes:


------------------------------------------
[...truncated 407.12 KB...]
Class dependency analysis for incremental compilation took 0.038 secs.
Created classpath snapshot for incremental compilation in 0.188 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 62b58cc06d848833fc51774c350144e6
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 7,5,main]) completed. Took 14.76 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution ****,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution ****,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is e05feaf425b60d9a929b2fe2d4dc4e79
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/src/main/java>', not found
Compiling with JDK Java compiler API.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project :runners:google-cloud-dataflow-java) (Thread[Execution ****,5,main]) started.
work action null (Thread[Execution ****,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 6,5,main]) completed. Took 0.047 secs.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Class dependency analysis for incremental compilation took 0.236 secs.
Created classpath snapshot for incremental compilation in 0.094 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache key e05feaf425b60d9a929b2fe2d4dc4e79
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Execution **** Thread 2,5,main]) completed. Took 2 mins 14.822 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is c48aceb8ea93e716b95deeac05519ab4
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main>', not found
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package>', not found
*******************
GRADLE SHADOW STATS

Total Jars: 15 (includes project)
Total Time: 2.619s [2619ms]
Average Time/Jar: 0.1746s [174.6ms]
*******************
Stored cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key c48aceb8ea93e716b95deeac05519ab4
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 2,5,main]) completed. Took 3.863 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****:legacy-****) (Thread[Execution **** Thread 4,5,main]) started.
work action null (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****:legacy-****) (Thread[Execution **** Thread 7,5,main]) started.
work action null (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:integrationTest (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:integrationTest (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:integrationTest (Thread[Execution **** Thread 3,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 5,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 10 started executing tests.

> Task :sdks:java:io:sparkreceiver:integrationTest
Custom actions are attached to task ':sdks:java:io:sparkreceiver:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:integrationTest' is df43a658e3a098ed72b2c7aa8089658a
Task ':sdks:java:io:sparkreceiver:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 10'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"600000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:guest@34.122.87.249:5672","--streamName=rabbitMqTestStream","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.43.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 10'
Successfully started process 'Gradle Test Executor 10'

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.43.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
    [Test ****] ERROR org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - Error during RabbitMQ clean up
    java.net.ConnectException: Connection refused (Connection refused)
    	at java.net.PlainSocketImpl.socketConnect(Native Method)
    	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:607)
    	at com.rabbitmq.client.impl.SocketFrameHandlerFactory.create(SocketFrameHandlerFactory.java:60)
    	at com.rabbitmq.client.impl.recovery.RecoveryAwareAMQConnectionFactory.newConnection(RecoveryAwareAMQConnectionFactory.java:62)
    	at com.rabbitmq.client.impl.recovery.AutorecoveringConnection.init(AutorecoveringConnection.java:156)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1106)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1063)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1021)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1182)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.clearRabbitMQ(SparkReceiverIOIT.java:261)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.setup(SparkReceiverIOIT.java:139)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
    	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker$2.run(TestWorker.java:176)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:100)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:60)
    	at org.gradle.process.internal.****.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:133)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:71)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.run(GradleWorkerMain.java:69)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.main(GradleWorkerMain.java:74)

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test ****] ERROR org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - Can not write to rabbit Connection refused (Connection refused)

Gradle Test Executor 10 finished executing tests.

> Task :sdks:java:io:sparkreceiver:integrationTest FAILED

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError
        at org.junit.Assert.fail(Assert.java:87)
        at org.junit.Assert.fail(Assert.java:96)
        at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:329)

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    [Test ****] ERROR org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - Error during RabbitMQ clean up
    java.net.ConnectException: Connection refused (Connection refused)
    	at java.net.PlainSocketImpl.socketConnect(Native Method)
    	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:607)
    	at com.rabbitmq.client.impl.SocketFrameHandlerFactory.create(SocketFrameHandlerFactory.java:60)
    	at com.rabbitmq.client.impl.recovery.RecoveryAwareAMQConnectionFactory.newConnection(RecoveryAwareAMQConnectionFactory.java:62)
    	at com.rabbitmq.client.impl.recovery.AutorecoveringConnection.init(AutorecoveringConnection.java:156)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1106)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1063)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1021)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1182)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.clearRabbitMQ(SparkReceiverIOIT.java:261)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.afterClass(SparkReceiverIOIT.java:148)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.RunAfters.invokeMethod(RunAfters.java:46)
    	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker$2.run(TestWorker.java:176)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:100)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:60)
    	at org.gradle.process.internal.****.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:133)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:71)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.run(GradleWorkerMain.java:69)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.main(GradleWorkerMain.java:74)

1 test completed, 1 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/reports/tests/integrationTest>
:sdks:java:io:sparkreceiver:integrationTest (Thread[Execution **** Thread 3,5,main]) completed. Took 4.902 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 7s
138 actionable tasks: 135 executed, 2 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/b4otcunppobpy

Stopped 9 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
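
Both error blocks above ("Error during RabbitMQ clean up" and "Can not write to rabbit") point to the same root cause: the test cannot reach the RabbitMQ broker given by --rabbitMqBootstrapServerAddress, so every ConnectionFactory.newConnection() call fails with java.net.ConnectException. Below is a minimal sketch of that kind of connection attempt, assuming the RabbitMQ Java client and the amqp URI from the pipeline options in the command line above; it is illustrative only, not the actual code in SparkReceiverIOIT.

    import com.rabbitmq.client.Connection;
    import com.rabbitmq.client.ConnectionFactory;

    public class RabbitMqConnectCheck {
      public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        // amqp URI taken from --rabbitMqBootstrapServerAddress in the test command line above
        factory.setUri("amqp://guest:guest@34.122.87.249:5672");
        // newConnection() throws java.net.ConnectException ("Connection refused")
        // when nothing is listening on that host/port, which matches the stack
        // traces from clearRabbitMQ() in the log.
        try (Connection connection = factory.newConnection()) {
          System.out.println("Broker reachable at " + connection.getAddress());
        }
      }
    }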

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_SparkReceiver_IO #10

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/10/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_SparkReceiver_IO #9

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/9/display/redirect?page=changes>

Changes:

[Moritz Mack] Keep Spark version in a single place only (BeamModulePlugin)

[Andrew Pilloud] Publish Python nexmark metrics to influxdb

[noreply] Issue#23599 Updated dataframe notebook

[noreply] Added a missing line break.

[Andrew Pilloud] Exclude nexmark from codecov, it has no tests

[Robert Bradshaw] Remove numpy C API dep from public declarations.

[Andrew Pilloud] Migrate nexmark to common config for cron jobs

[Kiley Sok] beam-perf

[Kiley Sok] fix

[noreply] [GitHub Actions] - Run RC Validations Workflow  (#23531)

[noreply] Add workflow to update milestone on issue close (#23629)

[noreply] add website page about data processing for ML (#23552)

[noreply] [Go SDK] Dataframe API wrapper  (#23450)

[noreply] [Go SDK]: Adds Automated Python Expansion Service (#23582)

[noreply] Include CombineFn's in __all__ (#23685)

[noreply] Bump google.golang.org/grpc from 1.50.0 to 1.50.1 in /sdks (#23654)

[noreply] [Playground][Frontend] Tags filter for Examples Catalog (#22074)

[noreply] [Go SDK] Extract output coders in expandCrossLanguage (#23641)

[noreply] Python 3.10 support (#23587)

[noreply] Fixes #22192: Avoids nullpointer error. Preserves previous behavior.

[noreply] Deflaking tests for BQ row insertions. These tests were flaky due to

[noreply] Add java 11 home to jenkins test (#23708)

[noreply] enable automatic expansion service (#23699)

[noreply] add expansion service option (#23712)

[noreply] Downgrade container cryptography version to avoid yanked version

[noreply] Update portable runner test timeout (#23696)

[noreply] Merge pull request #23510: Vortex multiplexing streams

[noreply] Io jms fix ack message checkpoint (#22932)

[noreply] [Playground] Examples CD (#23664)

[noreply] Update release instructions in Python 3.10 (#23702)

[noreply] Move Tensorflow Documentation (#23729)

[noreply] Bump golang.org/x/text from 0.3.7 to 0.4.0 in /sdks (#23686)

[noreply] Unit Content markdown styles (#23592) (#23662)

[noreply] Add reopen issue command (#23733)

[noreply] Add example of real time Anomaly Detection using RunInference (#23497)

[noreply] Support TIMESTAMP type in BigQueryIO with BEAM_ROW output type, and in

[Kenneth Knowles] Verify that secondary key coder is deterministic in SortValues

[Chamikara Madhusanka Jayalath] Updating Python dependencies for the 2.43.0 release

[noreply] Add PytorchBatchConverter (#23296)

[noreply] Pin version to grpcio in build-requirements.txt (#23735)

[noreply] Bump up python container versions. (#23716)

[noreply] Reduce log flood in Python PostCommit flink task (#23635)

[noreply] Speed up check on website links (#23737)

[thiagotnunes] tests: fixes SpannerIO unavailable retry test

[noreply] Remove yeandy from reviewers (#23753)

[noreply] Revert bigdataoss version upgrade (#23727)

[Chamikara Madhusanka Jayalath] Moving to 2.44.0-SNAPSHOT on master branch.

[noreply] Update the timeout in ValidatesContainer suite. (#23732)

[riteshghorse] fix lints

[noreply] Update google cloud vision >= 2.0.0 (#23755)

[noreply] Update GcsIO initialization to support converting input parameters to

[noreply] Adds instructions for running the Multi-language Java quickstart from

[Moritz Mack] Remove obsolete sparkRunner task from hadoop-format: not triggered, no

[noreply] Remove Spark2 from Java testing projects (addresses #23728) (#23749)

[noreply] bugfix/wrong-notebook-linl (#23777)

[noreply] [CdapIO] Integration CdapIO with SparkReceiverIO (#22584)

[noreply] Avoid Circular imports related to bigquery_schema_tools (#23731)

[noreply] Use Flink 1.13 for load tests (#23767)

[Kenneth Knowles] Re-enable PubsubTableProviderIT.testSQLSelectsArrayAttributes

[noreply] Remove obsolete native text io translation. (#23549)

[noreply] Eliminate nullness errors from GenerateSequence (#23744)

[noreply] Updated ipywidgets

[noreply] Add logos to case-studies "Also Used By" (#23781)

[noreply] Avoid pickling unstable reference to moved proto classes. (#23739)

[Robert Bradshaw] Unskip test_generated_class_pickle for cloudpickle.

[Kenneth Knowles] Enable checkerframework by default

[noreply] Allow local packages in requirements.txt dependency list. (#23684)

[noreply] Revert "Update BQIO to a single scheduled executor service reduce

[noreply] Updates Python test expansion service to use Cloud Pickle (#23786)

[noreply] Merge pull request #23795: Revert 23234: issue #23794

[noreply] Merge pull request #23556: Forward failed storage-api row inserts to the

[yathu] Bump dataflow java fn container version to  beam-master-20221022

[noreply] Remove unnecessary dependencies from jpms test (#23775)

[noreply] Use Spark 3 job-server as default Spark job-server for PortableRunner

[noreply] Support usage of custom profileName with AWS ProfileCredentialsProvider

[noreply] Migrate examples and maven-archetypes (including Java Quickstart) to

[Moritz Mack] Update remaining pointers to Spark runner to Spark 3 module (addresses

[noreply] Ignoring BigQuery partitions with empty files (#23710)

[noreply] Benchmarking RunInference Example (#23554)

[Kiley Sok] Increate timeout for test pipelines

[noreply] Bump Dataflow python containers to 20221021 (#23807)

[noreply] Allow MoreFutures.allAsList/allAsListWithExceptions to have the passed

[noreply] granting ruslan shamunov triage rights (#23806)

[noreply] Bump google.golang.org/api from 0.99.0 to 0.100.0 in /sdks (#23718)

[noreply] Initial DaskRunner for Beam (#22421)

[noreply] [Website] update PULL_REQUEST_TEMPLATE.md (#23576)

[noreply] [Website] change width of the additional case studies cards (#23824)

[chamikaramj] Adds a dependency to Python Multi-language library to the GCP Bom

[noreply] Support keyed executors in Samza Runner to process bundles for stateful

[Moritz Mack] Remove obsolete code from Spark 3 runner.

[noreply] Fixing Get Started header link (#23490)

[noreply] Bump cloud.google.com/go/bigquery from 1.42.0 to 1.43.0 in /sdks


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6e395412ed4fc23f84b20a27bfd7396495c8477a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6e395412ed4fc23f84b20a27bfd7396495c8477a # timeout=10
Commit message: "Bump cloud.google.com/go/bigquery from 1.42.0 to 1.43.0 in /sdks (#23820)"
 > git rev-list --no-walk fe41855c322e31dee2077d8cf3b95ad2fba85870 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_SparkReceiver_IO] $ /bin/bash -xe /tmp/jenkins8107933820754869393.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a
Fetching cluster endpoint and auth data.
CRITICAL: ACTION REQUIRED: gke-gcloud-auth-plugin, which is needed for continued use of kubectl, was not found or is not executable. Install gke-gcloud-auth-plugin for use with kubectl by following https://cloud.google.com/blog/products/containers-kubernetes/kubectl-auth-changes-in-gke
kubeconfig entry generated for io-datastores.
[beam_PerformanceTests_SparkReceiver_IO] $ /bin/bash -xe /tmp/jenkins2034865232753888432.sh
+ cp /home/jenkins/.kube/config <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9>
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9>

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_SparkReceiver_IO] $ /bin/bash -xe /tmp/jenkins1673360309119181014.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> createNamespace beam-performancetests-sparkreceiver-io-9
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9>
+ KUBERNETES_NAMESPACE=default
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9> --namespace=default'
+ createNamespace beam-performancetests-sparkreceiver-io-9
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9> create namespace beam-performancetests-sparkreceiver-io-9'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9> create namespace beam-performancetests-sparkreceiver-io-9
namespace/beam-performancetests-sparkreceiver-io-9 created
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
KUBERNETES_NAMESPACE=beam-performancetests-sparkreceiver-io-9

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_SparkReceiver_IO] $ /bin/bash -xe /tmp/jenkins1831701282338587607.sh
+ <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/kubernetes.sh> apply <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/rabbit/rabbitmq.yaml>
+ KUBECONFIG=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9>
+ KUBERNETES_NAMESPACE=beam-performancetests-sparkreceiver-io-9
+ KUBECTL='kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9> --namespace=beam-performancetests-sparkreceiver-io-9'
+ apply <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/rabbit/rabbitmq.yaml>
+ eval 'kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9> --namespace=beam-performancetests-sparkreceiver-io-9 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/rabbit/rabbitmq.yaml>'
++ kubectl --kubeconfig=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/config-beam-performancetests-sparkreceiver-io-9> --namespace=beam-performancetests-sparkreceiver-io-9 apply -R -f <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/rabbit/rabbitmq.yaml>
error: the path "<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/.test-infra/kubernetes/rabbit/rabbitmq.yaml>" does not exist
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_SparkReceiver_IO #8

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/8/display/redirect>

Changes:


------------------------------------------
[...truncated 404.89 KB...]
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Class dependency analysis for incremental compilation took 0.065 secs.
Created classpath snapshot for incremental compilation in 0.222 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 21b7ae9473f2db1d43a7cdf69387c133
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 6,5,main]) completed. Took 10.836 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[included builds,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[included builds,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
:runners:google-cloud-dataflow-java:testJar (Thread[included builds,5,main]) completed. Took 0.037 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project :runners:google-cloud-dataflow-java) (Thread[Execution **** Thread 2,5,main]) started.
work action null (Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 secs.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is ebf11aa2b45dfc535375823ac6d7e860
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/src/main/java>', not found
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Class dependency analysis for incremental compilation took 0.252 secs.
Created classpath snapshot for incremental compilation in 0.096 secs.
Stored cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache key ebf11aa2b45dfc535375823ac6d7e860
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava (Thread[Execution **** Thread 4,5,main]) completed. Took 2 mins 25.26 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is fe893ac3370253f995352853685896d3
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/resources/main>', not found
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/original_sources_to_package>', not found
*******************
GRADLE SHADOW STATS

Total Jars: 15 (includes project)
Total Time: 2.76s [2760ms]
Average Time/Jar: 0.184s [184.0ms]
*******************
Stored cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key fe893ac3370253f995352853685896d3
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[included builds,5,main]) completed. Took 4.039 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****:legacy-****) (Thread[Execution **** Thread 3,5,main]) started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****:legacy-****) (Thread[Execution **** Thread 4,5,main]) started.
work action null (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:integrationTest (Thread[included builds,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:integrationTest (Thread[included builds,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:integrationTest (Thread[Execution **** Thread 3,5,main]) started.
producer locations for task group 0 (Thread[included builds,5,main]) started.
producer locations for task group 0 (Thread[included builds,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 9 started executing tests.

> Task :sdks:java:io:sparkreceiver:integrationTest
Custom actions are attached to task ':sdks:java:io:sparkreceiver:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:integrationTest' is bddc45dbe08d42c8da0a082c91e6ff02
Task ':sdks:java:io:sparkreceiver:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 9'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"600000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:guest@35.238.102.3:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.44.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 9'
Successfully started process 'Gradle Test Executor 9'

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.44.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
    [Test ****] ERROR org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - Error during RabbitMQ clean up
    java.net.ConnectException: Connection refused (Connection refused)
    	at java.net.PlainSocketImpl.socketConnect(Native Method)
    	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:607)
    	at com.rabbitmq.client.impl.SocketFrameHandlerFactory.create(SocketFrameHandlerFactory.java:59)
    	at com.rabbitmq.client.impl.recovery.RecoveryAwareAMQConnectionFactory.newConnection(RecoveryAwareAMQConnectionFactory.java:63)
    	at com.rabbitmq.client.impl.recovery.AutorecoveringConnection.init(AutorecoveringConnection.java:160)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1216)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1173)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1131)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1294)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.clearRabbitMQ(SparkReceiverIOIT.java:261)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.setup(SparkReceiverIOIT.java:139)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
    	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker$2.run(TestWorker.java:176)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:100)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:60)
    	at org.gradle.process.internal.****.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:133)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:71)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.run(GradleWorkerMain.java:69)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.main(GradleWorkerMain.java:74)

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test ****] ERROR org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - Can not write to rabbit Connection refused (Connection refused)

Gradle Test Executor 9 finished executing tests.

> Task :sdks:java:io:sparkreceiver:integrationTest FAILED

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError
        at org.junit.Assert.fail(Assert.java:87)
        at org.junit.Assert.fail(Assert.java:96)
        at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:329)

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    [Test ****] ERROR org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - Error during RabbitMQ clean up
    java.net.ConnectException: Connection refused (Connection refused)
    	at java.net.PlainSocketImpl.socketConnect(Native Method)
    	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:607)
    	at com.rabbitmq.client.impl.SocketFrameHandlerFactory.create(SocketFrameHandlerFactory.java:59)
    	at com.rabbitmq.client.impl.recovery.RecoveryAwareAMQConnectionFactory.newConnection(RecoveryAwareAMQConnectionFactory.java:63)
    	at com.rabbitmq.client.impl.recovery.AutorecoveringConnection.init(AutorecoveringConnection.java:160)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1216)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1173)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1131)
    	at com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1294)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.clearRabbitMQ(SparkReceiverIOIT.java:261)
    	at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.afterClass(SparkReceiverIOIT.java:148)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.RunAfters.invokeMethod(RunAfters.java:46)
    	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker$2.run(TestWorker.java:176)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:100)
    	at org.gradle.api.internal.tasks.testing.****.TestWorker.execute(TestWorker.java:60)
    	at org.gradle.process.internal.****.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:133)
    	at org.gradle.process.internal.****.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:71)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.run(GradleWorkerMain.java:69)
    	at ****.org.gradle.process.internal.****.GradleWorkerMain.main(GradleWorkerMain.java:74)

1 test completed, 1 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/reports/tests/integrationTest>
:sdks:java:io:sparkreceiver:integrationTest (Thread[Execution **** Thread 3,5,main]) completed. Took 5.11 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 7s
138 actionable tasks: 131 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/as6w6oldsdb6g

Stopped 8 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org