Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/10/23 17:46:43 UTC
Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #1934
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/1934/display/redirect?page=changes>
Changes:
[ehudm] Upgrade BigQuery client from 0.25.0 to 1.6.0
------------------------------------------
[...truncated 30.08 MB...]
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 9980
[...29 more 'Cleaned accumulator' lines truncated...]
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 568 (mapToPair at GroupCombineFunctions.java:56) finished in 0.040 s
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10338
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10487
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10034
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10031
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 575)
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10132
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10748
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10052
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10782
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 9942
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10585
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10534
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_90_piece0 on localhost:35145 in memory (size: 34.1 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 575 (MapPartitionsRDD[2844] at map at TranslationUtils.java:129), which has no missing parents
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10696
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10015
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10304
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_86_piece0 on localhost:35145 in memory (size: 51.2 KB, free: 13.5 GB)
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10213
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10215
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10884
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10171
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 9923
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 9994
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_92_piece0 on localhost:35145 in memory (size: 53.3 KB, free: 13.5 GB)
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10801
[...41 more 'Cleaned accumulator' lines truncated...]
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_96_piece0 on localhost:35145 in memory (size: 34.1 KB, free: 13.5 GB)
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10400
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10663
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_94_piece0 on localhost:35145 in memory (size: 42.9 KB, free: 13.5 GB)
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10901
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10393
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 11029
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10374
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10762
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Removed broadcast_68_piece0 on localhost:35145 in memory (size: 49.3 KB, free: 13.5 GB)
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned accumulator 10102
[...25 more 'Cleaned accumulator' lines truncated...]
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_110 stored as values in memory (estimated size 238.8 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 56.4 KB, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:35145 (size: 56.4 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 110 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 575 (MapPartitionsRDD[2844] at map at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 575.0 with 4 tasks
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 575.0 (TID 476, localhost, executor driver, partition 0, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 575.0 (TID 477, localhost, executor driver, partition 1, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 575.0 (TID 478, localhost, executor driver, partition 2, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 575.0 (TID 479, localhost, executor driver, partition 3, PROCESS_LOCAL, 8308 bytes)
[Executor task launch worker for task 476] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 575.0 (TID 476)
[Executor task launch worker for task 479] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 575.0 (TID 479)
[Executor task launch worker for task 477] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 575.0 (TID 477)
[Executor task launch worker for task 478] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 575.0 (TID 478)
[Executor task launch worker for task 478] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 477] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 478] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 477] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 477] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
[Executor task launch worker for task 478] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
[Executor task launch worker for task 479] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 479] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 476] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 476] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 479] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
[Executor task launch worker for task 476] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
[Executor task launch worker for task 477] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 478] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:35145 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:35145 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 479] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 476] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:35145 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:35145 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 477] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 575.0 (TID 477). 59881 bytes result sent to driver
[Executor task launch worker for task 478] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 575.0 (TID 478). 59881 bytes result sent to driver
[Executor task launch worker for task 476] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 575.0 (TID 476). 59881 bytes result sent to driver
[Executor task launch worker for task 479] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 575.0 (TID 479). 59881 bytes result sent to driver
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 575.0 (TID 478) in 18 ms on localhost (executor driver) (1/4)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 575.0 (TID 479) in 18 ms on localhost (executor driver) (2/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 575.0 (TID 477) in 19 ms on localhost (executor driver) (3/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 575.0 (TID 476) in 19 ms on localhost (executor driver) (4/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 575.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 575 (foreach at UnboundedDataset.java:80) finished in 0.030 s
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:80, took 0.157587 s
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1540316234500 ms.3 from job set of time 1540316234500 ms
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.512 s for time 1540316234500 ms (execution: 0.497 s)
[Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@682cb91f{/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@2a78bec2{/streaming/batch,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@56557f0f{/static/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@723e477a{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
[dispatcher-event-loop-0] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 285 finished executing tests.
> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-d8c7a78e-e706-49f2-8a83-75804af0ad9a
org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@52297974{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
[dispatcher-event-loop-0] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 289 finished executing tests.
> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-5d694189-1e47-45d1-9ef7-ed80e2121987
Finished generating test XML results (0.099 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.101 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 10 mins 28.065 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
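[Editor's note: for anyone reproducing this failure locally, the failing task can be re-run with the diagnostic flags Gradle suggests above. This is a sketch only, assuming a local checkout of the Beam repository at the relevant revision; the task name is taken from the failure message.]

```shell
# Hypothetical local reproduction, assuming a checkout of apache/beam.
# Task name comes from the failure message; flags are Gradle's own suggestions.
./gradlew :beam-runners-spark:validatesRunnerBatch --stacktrace --info

# --scan publishes a build scan like the gradle.com/s/... link at the end of this log.
./gradlew :beam-runners-spark:validatesRunnerBatch --scan
```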
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 15m 40s
41 actionable tasks: 37 executed, 4 from cache
Publishing build scan...
https://gradle.com/s/jb6lsoruuwjgs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #1935
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/1935/display/redirect?page=changes>