Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/10/27 17:52:56 UTC
Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #4440
See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/4440/display/redirect>
Changes:
------------------------------------------
[...truncated 380.62 KB...]
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:76)
at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:142)
at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:115)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:86)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:185)
... 22 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Base64$Encoder.encode(Base64.java:262)
at java.util.Base64$Encoder.encodeToString(Base64.java:315)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.lambda$testJsonWrite$519c2e7c$1(BigQueryIOIT.java:149)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT$$Lambda$160/1444261924.apply(Unknown Source)
at org.apache.beam.sdk.io.gcp.bigquery.BatchedStreamingWrite$BatchAndInsertElements.processElement(BatchedStreamingWrite.java:264)
at org.apache.beam.sdk.io.gcp.bigquery.BatchedStreamingWrite$BatchAndInsertElements$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:185)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:76)
at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:142)
at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:115)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:86)
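The innermost frames of this OOM show each row being Base64-encoded to a String inside the testJsonWrite lambda (BigQueryIOIT.java:149). A minimal stdlib sketch of why per-element Base64 encoding is memory-hungry under a tight heap (the 1 KiB payload size is an assumption for illustration, not taken from the test):

```java
import java.util.Base64;

public class Base64Expansion {
    public static void main(String[] args) {
        // Base64 emits 4 output bytes for every 3 input bytes (plus padding),
        // so encoding a row to a String briefly keeps three copies live at
        // once: the input byte[], the encoded byte[], and the String built
        // from it. At high element rates this kind of churn is a classic
        // trigger for "GC overhead limit exceeded".
        byte[] payload = new byte[1024];
        String encoded = Base64.getEncoder().encodeToString(payload);
        System.out.println(encoded.length()); // 4 * ceil(1024 / 3) = 1368
    }
}
```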
Oct 27, 2022 5:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-10-27T17:46:18.120Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
java.lang.RuntimeException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.insertAll(BigQueryServicesImpl.java:1191)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.insertAll(BigQueryServicesImpl.java:1250)
at org.apache.beam.sdk.io.gcp.bigquery.BatchedStreamingWrite.flushRows(BatchedStreamingWrite.java:403)
at org.apache.beam.sdk.io.gcp.bigquery.BatchedStreamingWrite.access$900(BatchedStreamingWrite.java:67)
at org.apache.beam.sdk.io.gcp.bigquery.BatchedStreamingWrite$BatchAndInsertElements.finishBundle(BatchedStreamingWrite.java:286)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
Oct 27, 2022 5:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-10-27T17:46:19.654Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:187)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:108)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:56)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:117)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:218)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:169)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:83)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:420)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:389)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:314)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:162)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.gcp.bigquery.BatchedStreamingWrite$BatchAndInsertElements$DoFnInvoker.invokeTeardown(Unknown Source)
at org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.abort(DoFnInstanceManagers.java:100)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.abort(SimpleParDoFn.java:443)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.abort(ParDoOperation.java:64)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:100)
... 11 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
Caused by: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:115)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:86)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:185)
... 22 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.beam.sdk.util.WindowedValue$ValueInGlobalWindow.withValue(WindowedValue.java:263)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:76)
at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:142)
at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:115)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:86)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:185)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:108)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:56)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:117)
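Both SEVERE entries above carry the same remedy: "Consider specifying higher memory instances in PipelineOptions." On the Dataflow runner that means raising the worker machine type via standard pipeline option flags; a hedged config sketch (the machine type here is an illustrative sizing choice, not taken from this job's configuration):

```
--workerMachineType=n1-highmem-4   # more RAM per worker; sizing is an assumption
--numWorkers=5                     # matches the pool size seen in this run's logs
```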
Oct 27, 2022 5:50:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T17:50:26.135Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements)
Oct 27, 2022 5:50:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2022-10-27T17:50:26.224Z: Workflow failed. Causes: S04:Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
testpipeline-jenkins-1027-10271031-2dip-harness-f6r9
Root cause: Work item failed.,
testpipeline-jenkins-1027-10271031-2dip-harness-f6r9
Root cause: The worker lost contact with the service.,
testpipeline-jenkins-1027-10271031-2dip-harness-k16g
Root cause: The worker lost contact with the service.,
testpipeline-jenkins-1027-10271031-2dip-harness-f6r9
Root cause: The worker lost contact with the service.
Oct 27, 2022 5:50:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T17:50:26.301Z: Cleaning up.
Oct 27, 2022 5:50:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T17:50:26.380Z: Stopping worker pool...
Oct 27, 2022 5:52:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T17:52:44.962Z: Autoscaling: Resized worker pool from 5 to 0.
Oct 27, 2022 5:52:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T17:52:45.025Z: Worker pool stopped.
Oct 27, 2022 5:52:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-10-27_10_31_39-9919525325400600105 failed with status FAILED.
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_OUT
Load test results for test (ID): 97ee43f4-5575-4067-bccb-295175807743 and timestamp: 2022-10-27T17:31:23.824000000Z:
Metric:      Value:
write_time   49.743
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
java.lang.AssertionError: Values should be different. Actual: FAILED
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failEquals(Assert.java:187)
at org.junit.Assert.assertNotEquals(Assert.java:163)
at org.junit.Assert.assertNotEquals(Assert.java:177)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWrite(BigQueryIOIT.java:193)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testJsonWrite(BigQueryIOIT.java:152)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:136)
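The JUnit message "Values should be different. Actual: FAILED" means assertNotEquals received two equal values: the test asserted the pipeline's terminal state was not FAILED, and it was. A minimal stdlib re-creation of that contract (the helper below is a simplified stand-in for JUnit's assertNotEquals, not the real implementation):

```java
public class AssertNotEqualsDemo {
    // assertNotEquals throws exactly when the two values ARE equal,
    // producing the "Values should be different" message seen in the log.
    static void assertNotEquals(Object unexpected, Object actual) {
        boolean equal = (unexpected == null) ? actual == null : unexpected.equals(actual);
        if (equal) {
            throw new AssertionError("Values should be different. Actual: " + actual);
        }
    }

    public static void main(String[] args) {
        assertNotEquals("FAILED", "DONE"); // passes: states differ
        try {
            assertNotEquals("FAILED", "FAILED"); // throws, mirroring the log
        } catch (AssertionError expected) {
            System.out.println(expected.getMessage());
        }
    }
}
```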
Gradle Test Executor 2 finished executing tests.
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker Thread 6,5,main]) completed. Took 21 mins 31.083 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 24m 51s
137 actionable tasks: 83 executed, 52 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/pjnjssiyurjqc
Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_BiqQueryIO_Streaming_Performance_Test_Java #4443
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/4443/display/redirect>
Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #4442
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/4442/display/redirect>
Changes:
------------------------------------------
[...truncated 34.76 MB...]
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1125)
org.apache.beam.runners.dataflow.worker.util.BoundedQueueExecutor.lambda$executeLockHeld$0(BoundedQueueExecutor.java:133)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
java.lang.ClassCastException: com.google.api.services.bigquery.model.TableRow cannot be cast to org.apache.avro.generic.IndexedRecord
org.apache.avro.generic.GenericData.getField(GenericData.java:697)
org.apache.avro.generic.GenericData.getField(GenericData.java:712)
org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:164)
org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156)
org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118)
org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75)
org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62)
org.apache.beam.sdk.coders.AvroCoder.encode(AvroCoder.java:373)
org.apache.beam.sdk.values.TimestampedValue$TimestampedValueCoder.encode(TimestampedValue.java:110)
org.apache.beam.sdk.values.TimestampedValue$TimestampedValueCoder.encode(TimestampedValue.java:88)
org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
org.apache.beam.sdk.coders.NullableCoder.encode(NullableCoder.java:73)
org.apache.beam.sdk.coders.NullableCoder.encode(NullableCoder.java:63)
org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter$CheckpointCoder.encode(UnboundedReadFromBoundedSource.java:236)
org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter$CheckpointCoder.encode(UnboundedReadFromBoundedSource.java:215)
org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
org.apache.beam.runners.dataflow.worker.StreamingModeExecutionContext.flushState(StreamingModeExecutionContext.java:435)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1455)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:165)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1125)
org.apache.beam.runners.dataflow.worker.util.BoundedQueueExecutor.lambda$executeLockHeld$0(BoundedQueueExecutor.java:133)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
java.lang.ClassCastException: com.google.api.services.bigquery.model.TableRow cannot be cast to org.apache.avro.generic.IndexedRecord
org.apache.avro.generic.GenericData.getField(GenericData.java:697)
org.apache.avro.generic.GenericData.getField(GenericData.java:712)
org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:164)
org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156)
org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118)
org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75)
org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62)
org.apache.beam.sdk.coders.AvroCoder.encode(AvroCoder.java:373)
org.apache.beam.sdk.values.TimestampedValue$TimestampedValueCoder.encode(TimestampedValue.java:110)
org.apache.beam.sdk.values.TimestampedValue$TimestampedValueCoder.encode(TimestampedValue.java:88)
org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
org.apache.beam.sdk.coders.NullableCoder.encode(NullableCoder.java:73)
org.apache.beam.sdk.coders.NullableCoder.encode(NullableCoder.java:63)
org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter$CheckpointCoder.encode(UnboundedReadFromBoundedSource.java:236)
org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter$CheckpointCoder.encode(UnboundedReadFromBoundedSource.java:215)
org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
org.apache.beam.runners.dataflow.worker.StreamingModeExecutionContext.flushState(StreamingModeExecutionContext.java:435)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1455)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:165)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1125)
org.apache.beam.runners.dataflow.worker.util.BoundedQueueExecutor.lambda$executeLockHeld$0(BoundedQueueExecutor.java:133)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
java.lang.ClassCastException: com.google.api.services.bigquery.model.TableRow cannot be cast to org.apache.avro.generic.IndexedRecord
org.apache.avro.generic.GenericData.getField(GenericData.java:697)
org.apache.avro.generic.GenericData.getField(GenericData.java:712)
org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:164)
org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156)
org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118)
org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75)
org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62)
org.apache.beam.sdk.coders.AvroCoder.encode(AvroCoder.java:373)
org.apache.beam.sdk.values.TimestampedValue$TimestampedValueCoder.encode(TimestampedValue.java:110)
org.apache.beam.sdk.values.TimestampedValue$TimestampedValueCoder.encode(TimestampedValue.java:88)
org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
org.apache.beam.sdk.coders.NullableCoder.encode(NullableCoder.java:73)
org.apache.beam.sdk.coders.NullableCoder.encode(NullableCoder.java:63)
org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter$CheckpointCoder.encode(UnboundedReadFromBoundedSource.java:236)
org.apache.beam.runners.core.construction.UnboundedReadFromBoundedSource$BoundedToUnboundedSourceAdapter$CheckpointCoder.encode(UnboundedReadFromBoundedSource.java:215)
org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
org.apache.beam.runners.dataflow.worker.StreamingModeExecutionContext.flushState(StreamingModeExecutionContext.java:435)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1455)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:165)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1125)
org.apache.beam.runners.dataflow.worker.util.BoundedQueueExecutor.lambda$executeLockHeld$0(BoundedQueueExecutor.java:133)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
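The trace above shows AvroCoder handing a com.google.api.services.bigquery.model.TableRow to Avro's GenericDatumWriter, whose generic-record path casts each record to org.apache.avro.generic.IndexedRecord. TableRow does not implement that interface, so the cast throws. A minimal self-contained Java sketch of the same failure mode (the interface and row class below are stand-ins for illustration, not the real Avro or BigQuery types):

```java
// Stand-in for org.apache.avro.generic.IndexedRecord: the writer assumes
// every record can be read field-by-field through this interface.
interface IndexedRecordLike {
    Object get(int fieldIndex);
}

// Stand-in for TableRow: a map-like row type that does NOT implement
// the record interface the writer expects.
class TableRowLike {
    // fields irrelevant; the point is the missing interface
}

public class CoderMismatchSketch {
    // Mirrors GenericData.getField: an unchecked cast to the record
    // interface, which fails for any incompatible element type.
    static boolean failsCast() {
        Object row = new TableRowLike();
        try {
            IndexedRecordLike record = (IndexedRecordLike) row; // throws here
            record.get(0);
            return false;
        } catch (ClassCastException e) {
            return true; // same ClassCastException as in the log
        }
    }

    public static void main(String[] args) {
        if (!failsCast()) {
            throw new AssertionError("expected ClassCastException");
        }
        System.out.println("ClassCastException reproduced");
    }
}
```

The usual fix for this class of error is to make the element type and the coder agree, e.g. by not letting an Avro-based coder be inferred for TableRow elements; the exact pipeline change depends on how the test sets its coders, which this log does not show.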
Oct 27, 2022 8:58:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T20:58:45.269Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Oct 27, 2022 8:58:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-10-27T20:58:45.348Z: Worker pool stopped.
Oct 27, 2022 8:58:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-10-27_13_27_26-376340214398896279 finished with status CANCELLED.
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
java.lang.AssertionError: expected:<10485760> but was:<129911420>
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:633)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testRead(BigQueryIOIT.java:218)
at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:145)
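The assertion's expected value, 10485760, is exactly 10 * 1024 * 1024 bytes (10 MiB), while the observed 129911420 bytes is roughly 12x larger; duplicated or retried writes before the job was cancelled would be consistent with that, though the log does not confirm the cause. A small Java check of the arithmetic (the interpretation of the metric as a byte count is an assumption based on the test name, not something the log states):

```java
public class ExpectedBytesCheck {
    // The expected value from the failed assertion: 10 MiB expressed in bytes.
    static long expectedBytes() {
        return 10L * 1024 * 1024; // 10485760
    }

    public static void main(String[] args) {
        long actual = 129911420L; // the "but was" value from the log
        if (expectedBytes() != 10485760L) {
            throw new AssertionError("10 MiB should be 10485760 bytes");
        }
        // Ratio of observed to expected, to gauge how far off the run was.
        System.out.printf("actual/expected ratio: %.2f%n",
                (double) actual / expectedBytes());
    }
}
```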
Gradle Test Executor 2 finished executing tests.
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
1 test completed, 1 failed
Finished generating test XML results (0.876 secs) into: <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.66 secs) into: <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[included builds,5,main]) completed. Took 44 mins 20.332 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 44m 58s
137 actionable tasks: 82 executed, 53 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/itthlenbnmxnu
Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
beam_BiqQueryIO_Streaming_Performance_Test_Java - Build # 4441 - Aborted!
Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_BiqQueryIO_Streaming_Performance_Test_Java - Build # 4441 - Aborted:
Check console output at https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/4441/ to view the results.