Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/07/18 00:47:33 UTC

Build failed in Jenkins: beam_BiqQueryIO_Streaming_Performance_Test_Java #1142

See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1142/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-7672] dynamically setup acceptable wheel specs according to

[yoshiki.obata] fixup: update error message when parse failed

[yoshiki.obata] fixup: not to abort when wheel spec setup failed

[yoshiki.obata] fixup: not to use m flag at wheel name with python 3.8

[yoshiki.obata] fixup: simplified wheel name setting

[Boyuan Zhang] Insert TruncateSizedRestriction when pipeline starts to drain.

[Boyuan Zhang] Address latest comments.

[Boyuan Zhang] SpotlessApply

[Boyuan Zhang] Fix java build.

[Boyuan Zhang] spotlessApply

[Boyuan Zhang] Only forward split/progress when the only consumer is splittable.

[kcweaver] [BEAM-8244] Don't run external transform tests with pre_optimize=all.

[Ahmet Altay] Relax to matchers to match display data from the specific tests, not

[noreply] [BEAM-8454] Increase timeout and also enable thread stuckness detector.

[noreply] [BEAM-10490] Support read/write ZetaSQL DATE/TIME types from/to BigQuery

[noreply] [BEAM-10526] Use GrpcCleanupRule to use consistent methodology on server

[noreply] [BEAM-9968] Guarantee that outstanding split/progress requests are

[noreply] [BEAM-10420] Add support for per window invocation of


------------------------------------------
[...truncated 253.16 KB...]
    INFO: Uploading <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-M-qBbqkjgj_5qHQ8yu6fwrbB1OM4yFK_OA0TQpNzU4I.jar
    Jul 18, 2020 12:34:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-C3JFCQEL0GC-zpxoO0EOGzR4hmjt-T3wFNI_1I02Wh4.jar
    Jul 18, 2020 12:34:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-6oKVHZ2iayw6AMI-t63NaSx36MG75YxXQ6ArGxYU9JU.jar
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 171 files cached, 25 files newly uploaded in 1 seconds
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) as step s5
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign as step s8
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103009 bytes, hash 48c211ef275107695ece9a647f33e3560f428b3c00260d81b15ff5c9aabca5a2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SMIR7ydRB2lezppkfzPjVg9CizwAJg2BsV_1yaq8paI.pb
    Jul 18, 2020 12:34:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 18, 2020 12:34:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-17_17_34_28-13838072973842367060?project=apache-beam-testing
    Jul 18, 2020 12:34:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-17_17_34_28-13838072973842367060
    Jul 18, 2020 12:34:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-17_17_34_28-13838072973842367060
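The same cancellation can also be done from the submitting Java process through the Beam PipelineResult handle. A minimal, hypothetical sketch (the class name and the 15-minute timeout are illustrative, not part of this test):

    // Hypothetical alternative to the gcloud command above: cancel from the
    // launcher process via the Beam PipelineResult handle.
    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class CancelFromLauncher {
      static void runWithTimeout(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // Wait up to 15 minutes; if the job has not reached a terminal state, cancel it.
        PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(15));
        if (state == null || !state.isTerminal()) {
          result.cancel();
        }
      }
    }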
    Jul 18, 2020 12:34:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-18T00:34:28.033Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:35.722Z: Worker configuration: n1-standard-1 in us-central1-a.
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.472Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.550Z: Expanding GroupByKey operations into optimizable parts.
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.578Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.705Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.750Z: Fusing consumer Gather time into Read from source
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.788Z: Fusing consumer Map records into Gather time
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.823Z: Fusing consumer Write to BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.859Z: Fusing consumer Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to BQ/PrepareWrite/ParDo(Anonymous)
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.895Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.931Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:36.968Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.006Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.087Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.117Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.153Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Jul 18, 2020 12:34:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.190Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Jul 18, 2020 12:34:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.236Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
    Jul 18, 2020 12:34:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.637Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jul 18, 2020 12:34:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.722Z: Starting 5 workers in us-central1-a...
    Jul 18, 2020 12:34:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.784Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jul 18, 2020 12:34:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:34:37.959Z: Executing operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jul 18, 2020 12:35:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:35:09.245Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Jul 18, 2020 12:35:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:35:09.289Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Jul 18, 2020 12:35:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-18T00:35:10.980Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 18, 2020 12:35:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:35:14.644Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Jul 18, 2020 12:35:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:35:28.797Z: Workers have started successfully.
    Jul 18, 2020 12:35:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:35:28.838Z: Workers have started successfully.
    Jul 18, 2020 12:38:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:38:07.191Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jul 18, 2020 12:38:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:38:07.287Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jul 18, 2020 12:38:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:38:07.438Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jul 18, 2020 12:38:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-18T00:38:07.794Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
    Jul 18, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-07-18T00:46:54.446Z: An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
    java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:311)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	... 26 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getFixedLengthPrefixedByteArray(ChunkingShuffleBatchReader.java:98)
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getShuffleEntry(ChunkingShuffleBatchReader.java:82)
    	at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
    	at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
    	at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
    	at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
    	at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
    	at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)

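The SEVERE entry above recommends specifying higher-memory instances in PipelineOptions. A minimal sketch of what that could look like for a Dataflow job, assuming the standard DataflowPipelineOptions interface; the n1-highmem-2 machine type and the class name are illustrative choices, not what this test actually uses:

    // Hypothetical sketch: request larger (higher-memory) Dataflow workers so the
    // shuffle read in the reshuffle step has more heap than the n1-standard-1
    // workers seen in this log. The machine type below is only an example.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class HigherMemoryWorkers {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        // Same effect as passing --workerMachineType=n1-highmem-2 on the command line.
        options.setWorkerMachineType("n1-highmem-2");
        Pipeline pipeline = Pipeline.create(options);
        // ... build the Read from source / Gather time / Map records / Write to BQ steps here ...
        pipeline.run().waitUntilFinish();
      }
    }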

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 13 mins 15.207 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 6' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 11s
83 actionable tasks: 56 executed, 27 from cache

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=f4e954ba-6389-4c5d-811b-1e18d60a00a4, currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 7484
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-7484.out.log
----- Last  20 lines from daemon log file - daemon-7484.out.log -----
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 6' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 11s
83 actionable tasks: 56 executed, 27 from cache

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_BiqQueryIO_Streaming_Performance_Test_Java #1143

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1143/display/redirect?page=changes>

