Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/11/12 22:23:07 UTC

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2178

See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2178/display/redirect?page=changes>

Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

------------------------------------------
[...truncated 28.58 MB...]
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:79
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2824 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 832, ShuffleMapStage 826, ShuffleMapStage 836, ShuffleMapStage 830, ShuffleMapStage 822, ShuffleMapStage 834, ShuffleMapStage 838, ShuffleMapStage 824, ShuffleMapStage 828)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 822)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 818 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.7 KB, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:35693 (size: 54.7 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 143 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 818 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 818.0 with 4 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 818.0 (TID 642, localhost, executor driver, partition 0, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 818.0 (TID 643, localhost, executor driver, partition 1, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 818.0 (TID 644, localhost, executor driver, partition 2, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 818.0 (TID 645, localhost, executor driver, partition 3, PROCESS_LOCAL, 8165 bytes)
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 818.0 (TID 644)
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 818.0 (TID 643)
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 818.0 (TID 642)
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 818.0 (TID 645)
    [Executor task launch worker for task 644] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
    [Executor task launch worker for task 643] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
    [Executor task launch worker for task 645] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
    [Executor task launch worker for task 642] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 818.0 (TID 643). 59466 bytes result sent to driver
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 818.0 (TID 644). 59466 bytes result sent to driver
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 818.0 (TID 642). 59509 bytes result sent to driver
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 818.0 (TID 645). 59509 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 818.0 (TID 643) in 12 ms on localhost (executor driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 818.0 (TID 644) in 12 ms on localhost (executor driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 818.0 (TID 642) in 13 ms on localhost (executor driver) (3/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 818.0 (TID 645) in 12 ms on localhost (executor driver) (4/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 818.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 818 (mapToPair at GroupCombineFunctions.java:57) finished in 0.019 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 822, ShuffleMapStage 821, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 821 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144 stored as values in memory (estimated size 216.3 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:35693 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 144 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 821 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 821.0 with 5 tasks
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 821.0 (TID 646, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 821.0 (TID 647, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 821.0 (TID 648, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 821.0 (TID 649, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436 bytes)
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 821.0 (TID 646)
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 821.0 (TID 647)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 821.0 (TID 649)
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 821.0 (TID 648)
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 821.0 (TID 648). 59940 bytes result sent to driver
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 821.0 (TID 647). 59940 bytes result sent to driver
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 821.0 (TID 649). 59940 bytes result sent to driver
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 821.0 (TID 646). 59940 bytes result sent to driver
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 821.0 (TID 650, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 821.0 (TID 650)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 821.0 (TID 648) in 14 ms on localhost (executor driver) (1/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 821.0 (TID 649) in 14 ms on localhost (executor driver) (2/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 821.0 (TID 646) in 15 ms on localhost (executor driver) (3/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 821.0 (TID 647) in 15 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 821.0 (TID 650). 59424 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 821.0 (TID 650) in 13 ms on localhost (executor driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 821.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 821 (mapToPair at GroupCombineFunctions.java:57) finished in 0.033 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 822, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 822 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145 stored as values in memory (estimated size 217.5 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:35693 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 822 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 822.0 with 5 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 822.0 (TID 651, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 822.0 (TID 652, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 822.0 (TID 653, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 822.0 (TID 654, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 822.0 (TID 651)
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 822.0 (TID 654)
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 822.0 (TID 653)
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 822.0 (TID 652)
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 822.0 (TID 654). 59939 bytes result sent to driver
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 822.0 (TID 653). 59896 bytes result sent to driver
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 822.0 (TID 652). 59896 bytes result sent to driver
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 822.0 (TID 655, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 822.0 (TID 655)
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 822.0 (TID 651). 59896 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 822.0 (TID 654) in 12 ms on localhost (executor driver) (1/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 822.0 (TID 652) in 12 ms on localhost (executor driver) (2/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 822.0 (TID 653) in 13 ms on localhost (executor driver) (3/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 822.0 (TID 651) in 14 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 822.0 (TID 655). 59853 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 822.0 (TID 655) in 11 ms on localhost (executor driver) (5/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 822.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 822 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.032 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146 stored as values in memory (estimated size 188.2 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 58.1 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:35693 (size: 58.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 839.0 with 4 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 839.0 (TID 656, localhost, executor driver, partition 0, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 839.0 (TID 657, localhost, executor driver, partition 1, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 839.0 (TID 658, localhost, executor driver, partition 2, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 839.0 (TID 659, localhost, executor driver, partition 3, PROCESS_LOCAL, 8132 bytes)
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 839.0 (TID 658)
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 839.0 (TID 656)
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 839.0 (TID 659)
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 839.0 (TID 657)
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:35693 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 839.0 (TID 658). 59881 bytes result sent to driver
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 839.0 (TID 656). 59881 bytes result sent to driver
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 839.0 (TID 657). 59881 bytes result sent to driver
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 839.0 (TID 659). 59881 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 839.0 (TID 658) in 12 ms on localhost (executor driver) (1/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 839.0 (TID 656) in 12 ms on localhost (executor driver) (2/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 839.0 (TID 657) in 12 ms on localhost (executor driver) (3/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 839.0 (TID 659) in 12 ms on localhost (executor driver) (4/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 839.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 839 (foreach at UnboundedDataset.java:79) finished in 0.019 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:79, took 0.111828 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542060819000 ms.3 from job set of time 1542060819000 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.294 s for time 1542060819000 ms (execution: 0.462 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@525b9863{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@2b6c55aa{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@284404c5{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@c30a4b0{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 291 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cdc4b58e-417e-4590-bede-f3eab62cb5a4

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@26d80717{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-0] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-b914c293-c503-4c8d-9c83-29e8f3e6bcdb
Finished generating test XML results (0.128 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.106 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 10 mins 27.355 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 25s
43 actionable tasks: 40 executed, 3 from cache

Publishing build scan...
https://gradle.com/s/fov4h7ngbm3vm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2184

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2184/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2183

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2183/display/redirect?page=changes>

Changes:

[github] Clarify in docstrings that we expect  TFRecord values to be bytes

------------------------------------------
[...truncated 29.36 MB...]
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:79
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2824 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 828, ShuffleMapStage 832, ShuffleMapStage 826, ShuffleMapStage 836, ShuffleMapStage 830, ShuffleMapStage 834, ShuffleMapStage 838, ShuffleMapStage 821, ShuffleMapStage 810)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 826)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 824 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.7 KB, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:36139 (size: 54.7 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 143 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 824 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 824.0 with 4 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 824.0 (TID 642, localhost, executor driver, partition 0, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 824.0 (TID 643, localhost, executor driver, partition 1, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 824.0 (TID 644, localhost, executor driver, partition 2, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 824.0 (TID 645, localhost, executor driver, partition 3, PROCESS_LOCAL, 8165 bytes)
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 824.0 (TID 642)
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 824.0 (TID 644)
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 824.0 (TID 643)
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 824.0 (TID 645)
    [Executor task launch worker for task 643] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
    [Executor task launch worker for task 645] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
    [Executor task launch worker for task 642] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
    [Executor task launch worker for task 644] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 824.0 (TID 645). 59509 bytes result sent to driver
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 824.0 (TID 643). 59509 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 824.0 (TID 645) in 11 ms on localhost (executor driver) (1/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 824.0 (TID 643) in 11 ms on localhost (executor driver) (2/4)
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 824.0 (TID 644). 59509 bytes result sent to driver
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 824.0 (TID 642). 59509 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 824.0 (TID 644) in 15 ms on localhost (executor driver) (3/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 824.0 (TID 642) in 15 ms on localhost (executor driver) (4/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 824.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 824 (mapToPair at GroupCombineFunctions.java:57) finished in 0.022 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 825, ShuffleMapStage 826, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 825 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144 stored as values in memory (estimated size 216.3 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:36139 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 144 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 825 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 825.0 with 5 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 825.0 (TID 646, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 825.0 (TID 647, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 825.0 (TID 648, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 825.0 (TID 649, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436 bytes)
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 825.0 (TID 646)
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 825.0 (TID 648)
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 825.0 (TID 647)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 825.0 (TID 649)
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 825.0 (TID 649). 59940 bytes result sent to driver
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 825.0 (TID 646). 59940 bytes result sent to driver
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 825.0 (TID 648). 59940 bytes result sent to driver
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 825.0 (TID 650, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 825.0 (TID 650)
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 825.0 (TID 647). 59940 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 825.0 (TID 649) in 13 ms on localhost (executor driver) (1/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 825.0 (TID 647) in 14 ms on localhost (executor driver) (2/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 825.0 (TID 646) in 15 ms on localhost (executor driver) (3/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 825.0 (TID 648) in 14 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 825.0 (TID 650). 59424 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 825.0 (TID 650) in 14 ms on localhost (executor driver) (5/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 825.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 825 (mapToPair at GroupCombineFunctions.java:57) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 826, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 826 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145 stored as values in memory (estimated size 217.5 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:36139 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 826 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 826.0 with 5 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 826.0 (TID 651, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 826.0 (TID 652, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 826.0 (TID 653, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 826.0 (TID 654, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 826.0 (TID 651)
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 826.0 (TID 653)
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 826.0 (TID 654)
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 826.0 (TID 652)
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 826.0 (TID 654). 59896 bytes result sent to driver
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 826.0 (TID 652). 59853 bytes result sent to driver
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 826.0 (TID 651). 59896 bytes result sent to driver
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 826.0 (TID 653). 59896 bytes result sent to driver
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 826.0 (TID 655, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 826.0 (TID 655)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 826.0 (TID 654) in 13 ms on localhost (executor driver) (1/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 826.0 (TID 653) in 14 ms on localhost (executor driver) (2/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 826.0 (TID 652) in 14 ms on localhost (executor driver) (3/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 826.0 (TID 651) in 14 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 826.0 (TID 655). 59853 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 826.0 (TID 655) in 12 ms on localhost (executor driver) (5/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 826.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 826 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.031 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146 stored as values in memory (estimated size 188.2 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 58.1 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:36139 (size: 58.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 839.0 with 4 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 839.0 (TID 656, localhost, executor driver, partition 0, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 839.0 (TID 657, localhost, executor driver, partition 1, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 839.0 (TID 658, localhost, executor driver, partition 2, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 839.0 (TID 659, localhost, executor driver, partition 3, PROCESS_LOCAL, 8132 bytes)
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 839.0 (TID 656)
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 839.0 (TID 659)
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 839.0 (TID 657)
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 839.0 (TID 658)
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:36139 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 839.0 (TID 657). 59881 bytes result sent to driver
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 839.0 (TID 659). 59881 bytes result sent to driver
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 839.0 (TID 656). 59881 bytes result sent to driver
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 839.0 (TID 658). 59881 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 839.0 (TID 657) in 13 ms on localhost (executor driver) (1/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 839.0 (TID 656) in 13 ms on localhost (executor driver) (2/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 839.0 (TID 659) in 14 ms on localhost (executor driver) (3/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 839.0 (TID 658) in 15 ms on localhost (executor driver) (4/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 839.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 839 (foreach at UnboundedDataset.java:79) finished in 0.022 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:79, took 0.118111 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542073210000 ms.3 from job set of time 1542073210000 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.488 s for time 1542073210000 ms (execution: 0.562 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@69069e4c{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@f992f72{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@2a4f5f5f{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@4201964f{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-2] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 289 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-8f3b7e02-00a4-47b8-9070-48ce47b9a720

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@58809b2e{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-debc74ed-8618-4819-bebb-c70ae93286bb
Finished generating test XML results (0.124 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.115 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 10 mins 27.965 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings
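Gradle's suggestions above can be combined into a single re-run of the failing task. A minimal sketch, assuming a standard Beam checkout with the `./gradlew` wrapper at the repository root (the task name is taken from the failure message; the flags are the ones Gradle itself suggests):

```shell
# Re-run only the failing task with the diagnostics Gradle suggests.
# NOTE: the ./gradlew wrapper path and working directory are assumptions about
# the checkout layout; adjust them to where the Beam sources live.
GRADLE_TASK=":beam-runners-spark:validatesRunnerBatch"
# --stacktrace: full stack traces; --info: more log output;
# --warning-mode all: show each individual deprecation warning
GRADLE_FLAGS="--stacktrace --info --warning-mode all"

CMD="./gradlew $GRADLE_TASK $GRADLE_FLAGS"
echo "$CMD"   # printed here rather than executed, so the sketch is side-effect free
```

Running the composed command from the repository root re-executes only the batch ValidatesRunner suite, which is where the failing tests reported above live.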

BUILD FAILED in 15m 24s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/v2kh6yu6iac6a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2182

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2182/display/redirect?page=changes>

Changes:

[huangry] Update worker container version to most recent release.

------------------------------------------
[...truncated 29.34 MB...]
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:79
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2472 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2500 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2509 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 31 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 667 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 658, ShuffleMapStage 662, ShuffleMapStage 651, ShuffleMapStage 666, ShuffleMapStage 660, ShuffleMapStage 637, ShuffleMapStage 664, ShuffleMapStage 653)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 658)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_127 stored as values in memory (estimated size 177.5 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 54.7 KB, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:43437 (size: 54.7 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 127 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 656.0 with 4 tasks
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 656.0 (TID 570, localhost, executor driver, partition 0, PROCESS_LOCAL, 8127 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 656.0 (TID 571, localhost, executor driver, partition 1, PROCESS_LOCAL, 8127 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 656.0 (TID 572, localhost, executor driver, partition 2, PROCESS_LOCAL, 8127 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 656.0 (TID 573, localhost, executor driver, partition 3, PROCESS_LOCAL, 8127 bytes)
    [Executor task launch worker for task 571] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 656.0 (TID 571)
    [Executor task launch worker for task 573] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 656.0 (TID 573)
    [Executor task launch worker for task 570] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 656.0 (TID 570)
    [Executor task launch worker for task 572] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 656.0 (TID 572)
    [Executor task launch worker for task 573] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_3 locally
    [Executor task launch worker for task 572] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_2 locally
    [Executor task launch worker for task 570] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_0 locally
    [Executor task launch worker for task 571] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_1 locally
    [Executor task launch worker for task 573] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 656.0 (TID 573). 59509 bytes result sent to driver
    [Executor task launch worker for task 571] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 656.0 (TID 571). 59466 bytes result sent to driver
    [Executor task launch worker for task 570] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 656.0 (TID 570). 59509 bytes result sent to driver
    [Executor task launch worker for task 572] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 656.0 (TID 572). 59509 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 656.0 (TID 571) in 14 ms on localhost (executor driver) (1/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 656.0 (TID 572) in 14 ms on localhost (executor driver) (2/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 656.0 (TID 570) in 14 ms on localhost (executor driver) (3/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 656.0 (TID 573) in 14 ms on localhost (executor driver) (4/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 656.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 656 (mapToPair at GroupCombineFunctions.java:57) finished in 0.023 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 658, ResultStage 667, ShuffleMapStage 657)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 657 (MapPartitionsRDD[2500] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_128 stored as values in memory (estimated size 211.7 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 63.2 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:43437 (size: 63.2 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 128 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 657 (MapPartitionsRDD[2500] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 657.0 with 5 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 657.0 (TID 574, localhost, executor driver, partition 0, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 657.0 (TID 575, localhost, executor driver, partition 1, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 657.0 (TID 576, localhost, executor driver, partition 2, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 657.0 (TID 577, localhost, executor driver, partition 3, PROCESS_LOCAL, 8376 bytes)
    [Executor task launch worker for task 574] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 657.0 (TID 574)
    [Executor task launch worker for task 575] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 657.0 (TID 575)
    [Executor task launch worker for task 576] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 657.0 (TID 576)
    [Executor task launch worker for task 577] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 657.0 (TID 577)
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_2 locally
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_1 locally
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_0 locally
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_3 locally
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_3 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_0 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_1 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_2 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 577] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 657.0 (TID 577). 59940 bytes result sent to driver
    [Executor task launch worker for task 576] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 657.0 (TID 576). 59940 bytes result sent to driver
    [Executor task launch worker for task 574] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 657.0 (TID 574). 59940 bytes result sent to driver
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 657.0 (TID 578, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 578] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 657.0 (TID 578)
    [Executor task launch worker for task 575] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 657.0 (TID 575). 59940 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 657.0 (TID 577) in 17 ms on localhost (executor driver) (1/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 657.0 (TID 576) in 17 ms on localhost (executor driver) (2/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 657.0 (TID 574) in 18 ms on localhost (executor driver) (3/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 657.0 (TID 575) in 18 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 578] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 657.0 (TID 578). 59467 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 657.0 (TID 578) in 15 ms on localhost (executor driver) (5/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 657.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 657 (mapToPair at GroupCombineFunctions.java:57) finished in 0.039 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 658, ResultStage 667)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 658 (MapPartitionsRDD[2509] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_129 stored as values in memory (estimated size 213.0 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 63.3 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:43437 (size: 63.3 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 658 (MapPartitionsRDD[2509] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 658.0 with 5 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 658.0 (TID 579, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 658.0 (TID 580, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 658.0 (TID 581, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 658.0 (TID 582, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 579] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 658.0 (TID 579)
    [Executor task launch worker for task 580] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 658.0 (TID 580)
    [Executor task launch worker for task 581] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 658.0 (TID 581)
    [Executor task launch worker for task 582] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 658.0 (TID 582)
    [Executor task launch worker for task 579] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 579] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 579] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 658.0 (TID 579). 59896 bytes result sent to driver
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 658.0 (TID 583, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 581] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 582] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 580] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 583] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 658.0 (TID 583)
    [Executor task launch worker for task 581] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 580] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 582] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 658.0 (TID 579) in 14 ms on localhost (executor driver) (1/5)
    [Executor task launch worker for task 581] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 658.0 (TID 581). 59896 bytes result sent to driver
    [Executor task launch worker for task 582] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 658.0 (TID 582). 59896 bytes result sent to driver
    [Executor task launch worker for task 580] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 658.0 (TID 580). 59896 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 658.0 (TID 582) in 17 ms on localhost (executor driver) (2/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 658.0 (TID 581) in 19 ms on localhost (executor driver) (3/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 658.0 (TID 580) in 19 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 583] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 583] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 583] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 658.0 (TID 583). 59896 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 658.0 (TID 583) in 14 ms on localhost (executor driver) (5/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 658.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 658 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.036 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 667)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 667 (MapPartitionsRDD[2529] at map at TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_130 stored as values in memory (estimated size 187.9 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 57.8 KB, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:43437 (size: 57.8 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 667 (MapPartitionsRDD[2529] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 667.0 with 4 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 667.0 (TID 584, localhost, executor driver, partition 0, PROCESS_LOCAL, 8094 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 667.0 (TID 585, localhost, executor driver, partition 1, PROCESS_LOCAL, 8094 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 667.0 (TID 586, localhost, executor driver, partition 2, PROCESS_LOCAL, 8094 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 667.0 (TID 587, localhost, executor driver, partition 3, PROCESS_LOCAL, 8094 bytes)
    [Executor task launch worker for task 584] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 667.0 (TID 584)
    [Executor task launch worker for task 585] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 667.0 (TID 585)
    [Executor task launch worker for task 587] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 667.0 (TID 587)
    [Executor task launch worker for task 586] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 667.0 (TID 586)
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_0 locally
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_3 locally
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_2 locally
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_0 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_3 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_2 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_1 locally
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_1 in memory on localhost:43437 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 584] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 667.0 (TID 584). 59881 bytes result sent to driver
    [Executor task launch worker for task 586] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 667.0 (TID 586). 59881 bytes result sent to driver
    [Executor task launch worker for task 587] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 667.0 (TID 587). 59881 bytes result sent to driver
    [Executor task launch worker for task 585] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 667.0 (TID 585). 59881 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 667.0 (TID 584) in 15 ms on localhost (executor driver) (1/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 667.0 (TID 586) in 15 ms on localhost (executor driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 667.0 (TID 587) in 15 ms on localhost (executor driver) (3/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 667.0 (TID 585) in 17 ms on localhost (executor driver) (4/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 667.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 667 (foreach at UnboundedDataset.java:79) finished in 0.025 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 31 finished: foreach at UnboundedDataset.java:79, took 0.133426 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542068604500 ms.3 from job set of time 1542068604500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.888 s for time 1542068604500 ms (execution: 0.560 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@255a7af4{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@31769a32{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@1b8a8e67{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@1b084747{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-1] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 289 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-4fa7e155-c822-4b8a-a3ce-38e1805f928e

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@41f908f9{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-0] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-7f5f02e2-fe2e-4317-bb01-6c5549cae9dc
Finished generating test XML results (0.125 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.114 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 10 mins 29.105 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
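For local reproduction, Gradle's three suggestions can be combined into one re-run of the failing task; a sketch, assuming a Beam source checkout with the Gradle wrapper at the repository root (the task name is taken from the failure above):

```shell
# Re-run only the failing task with a stack trace, verbose log output,
# and a build scan (sketch; assumes the wrapper in the Beam repo root):
./gradlew :beam-runners-spark:validatesRunnerBatch --stacktrace --info --scan
```

The individual test failures are still easiest to read in the HTML report linked in the "What went wrong" section; the flags above mainly help when the report itself gives no stack trace.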

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 41s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/xzs5zs6d452uc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2181

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2181/display/redirect>

------------------------------------------
[...truncated 29.72 MB...]
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2472 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2500 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2509 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 31 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 667 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 658, ShuffleMapStage 662, ShuffleMapStage 651, ShuffleMapStage 666, ShuffleMapStage 660, ShuffleMapStage 637, ShuffleMapStage 664, ShuffleMapStage 653)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 658)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_127 stored as values in memory (estimated size 177.5 KB, free 13.4 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 54.7 KB, free 13.4 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:33319 (size: 54.7 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 127 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 656.0 with 4 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 656.0 (TID 570, localhost, executor driver, partition 0, PROCESS_LOCAL, 8127 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 656.0 (TID 571, localhost, executor driver, partition 1, PROCESS_LOCAL, 8127 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 656.0 (TID 572, localhost, executor driver, partition 2, PROCESS_LOCAL, 8127 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 656.0 (TID 573, localhost, executor driver, partition 3, PROCESS_LOCAL, 8127 bytes)
    [Executor task launch worker for task 571] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 656.0 (TID 571)
    [Executor task launch worker for task 572] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 656.0 (TID 572)
    [Executor task launch worker for task 570] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 656.0 (TID 570)
    [Executor task launch worker for task 573] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 656.0 (TID 573)
    [Executor task launch worker for task 571] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_1 locally
    [Executor task launch worker for task 573] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_3 locally
    [Executor task launch worker for task 572] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_2 locally
    [Executor task launch worker for task 570] INFO org.apache.spark.storage.BlockManager - Found block rdd_2244_0 locally
    [Executor task launch worker for task 571] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 656.0 (TID 571). 59509 bytes result sent to driver
    [Executor task launch worker for task 573] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 656.0 (TID 573). 59509 bytes result sent to driver
    [Executor task launch worker for task 572] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 656.0 (TID 572). 59509 bytes result sent to driver
    [Executor task launch worker for task 570] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 656.0 (TID 570). 59509 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 656.0 (TID 571) in 13 ms on localhost (executor driver) (1/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 656.0 (TID 573) in 13 ms on localhost (executor driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 656.0 (TID 572) in 13 ms on localhost (executor driver) (3/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 656.0 (TID 570) in 14 ms on localhost (executor driver) (4/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 656.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 656 (mapToPair at GroupCombineFunctions.java:57) finished in 0.022 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 658, ResultStage 667, ShuffleMapStage 657)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 657 (MapPartitionsRDD[2500] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_128 stored as values in memory (estimated size 211.7 KB, free 13.4 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 63.3 KB, free 13.4 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:33319 (size: 63.3 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 128 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 657 (MapPartitionsRDD[2500] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 657.0 with 5 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 657.0 (TID 574, localhost, executor driver, partition 0, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 657.0 (TID 575, localhost, executor driver, partition 1, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 657.0 (TID 576, localhost, executor driver, partition 2, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 657.0 (TID 577, localhost, executor driver, partition 3, PROCESS_LOCAL, 8376 bytes)
    [Executor task launch worker for task 577] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 657.0 (TID 577)
    [Executor task launch worker for task 575] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 657.0 (TID 575)
    [Executor task launch worker for task 574] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 657.0 (TID 574)
    [Executor task launch worker for task 576] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 657.0 (TID 576)
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_2 locally
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_3 locally
    [Executor task launch worker for task 577] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_3 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [Executor task launch worker for task 576] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_2 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_3 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_2 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_0 locally
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.BlockManager - Found block rdd_2169_1 locally
    [Executor task launch worker for task 575] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_1 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [Executor task launch worker for task 574] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_0 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_1 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2484_0 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 577] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 657.0 (TID 577). 59940 bytes result sent to driver
    [Executor task launch worker for task 576] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 657.0 (TID 576). 59940 bytes result sent to driver
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 657.0 (TID 578, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 578] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 657.0 (TID 578)
    [Executor task launch worker for task 574] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 657.0 (TID 574). 59940 bytes result sent to driver
    [Executor task launch worker for task 575] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 657.0 (TID 575). 59940 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 657.0 (TID 577) in 16 ms on localhost (executor driver) (1/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 657.0 (TID 576) in 16 ms on localhost (executor driver) (2/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 657.0 (TID 574) in 17 ms on localhost (executor driver) (3/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 657.0 (TID 575) in 18 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 578] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 657.0 (TID 578). 59467 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 657.0 (TID 578) in 16 ms on localhost (executor driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 657.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 657 (mapToPair at GroupCombineFunctions.java:57) finished in 0.039 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 658, ResultStage 667)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 658 (MapPartitionsRDD[2509] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_129 stored as values in memory (estimated size 213.0 KB, free 13.4 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 63.2 KB, free 13.4 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:33319 (size: 63.2 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 658 (MapPartitionsRDD[2509] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 658.0 with 5 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 658.0 (TID 579, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 658.0 (TID 580, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 658.0 (TID 581, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 658.0 (TID 582, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 579] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 658.0 (TID 579)
    [Executor task launch worker for task 580] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 658.0 (TID 580)
    [Executor task launch worker for task 581] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 658.0 (TID 581)
    [Executor task launch worker for task 582] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 658.0 (TID 582)
    [Executor task launch worker for task 579] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 582] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 579] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 582] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 580] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 581] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 580] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 581] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 582] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 658.0 (TID 582). 59896 bytes result sent to driver
    [Executor task launch worker for task 579] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 658.0 (TID 579). 59896 bytes result sent to driver
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 658.0 (TID 583, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 583] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 658.0 (TID 583)
    [Executor task launch worker for task 580] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 658.0 (TID 580). 59896 bytes result sent to driver
    [Executor task launch worker for task 581] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 658.0 (TID 581). 59896 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 658.0 (TID 579) in 14 ms on localhost (executor driver) (1/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 658.0 (TID 582) in 14 ms on localhost (executor driver) (2/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 658.0 (TID 580) in 15 ms on localhost (executor driver) (3/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 658.0 (TID 581) in 15 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 583] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 583] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 583] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 658.0 (TID 583). 59853 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 658.0 (TID 583) in 13 ms on localhost (executor driver) (5/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 658.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 658 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 667)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 667 (MapPartitionsRDD[2529] at map at TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_130 stored as values in memory (estimated size 187.9 KB, free 13.4 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 57.8 KB, free 13.4 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:33319 (size: 57.8 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 667 (MapPartitionsRDD[2529] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 667.0 with 4 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 667.0 (TID 584, localhost, executor driver, partition 0, PROCESS_LOCAL, 8094 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 667.0 (TID 585, localhost, executor driver, partition 1, PROCESS_LOCAL, 8094 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 667.0 (TID 586, localhost, executor driver, partition 2, PROCESS_LOCAL, 8094 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 667.0 (TID 587, localhost, executor driver, partition 3, PROCESS_LOCAL, 8094 bytes)
    [Executor task launch worker for task 586] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 667.0 (TID 586)
    [Executor task launch worker for task 585] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 667.0 (TID 585)
    [Executor task launch worker for task 587] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 667.0 (TID 587)
    [Executor task launch worker for task 584] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 667.0 (TID 584)
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_3 locally
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_0 locally
    [Executor task launch worker for task 584] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_0 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [Executor task launch worker for task 587] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_3 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_3 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_2 locally
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.BlockManager - Found block rdd_2197_1 locally
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_0 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 585] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_1 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [Executor task launch worker for task 586] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_2 stored as bytes in memory (estimated size 4.0 B, free 13.4 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_2 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2512_1 in memory on localhost:33319 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 587] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 667.0 (TID 587). 59881 bytes result sent to driver
    [Executor task launch worker for task 584] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 667.0 (TID 584). 59881 bytes result sent to driver
    [Executor task launch worker for task 586] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 667.0 (TID 586). 59881 bytes result sent to driver
    [Executor task launch worker for task 585] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 667.0 (TID 585). 59881 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 667.0 (TID 584) in 14 ms on localhost (executor driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 667.0 (TID 587) in 14 ms on localhost (executor driver) (2/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 667.0 (TID 585) in 15 ms on localhost (executor driver) (3/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 667.0 (TID 586) in 15 ms on localhost (executor driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 667.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 667 (foreach at UnboundedDataset.java:79) finished in 0.024 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 31 finished: foreach at UnboundedDataset.java:79, took 0.127969 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542067610500 ms.3 from job set of time 1542067610500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 7.188 s for time 1542067610500 ms (execution: 0.540 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@3fd82786{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@57c784fc{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@5f05e8ae{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@1a70216f{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 289 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-cd34cceb-8cf0-4168-afcb-d53a903a9e82

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@15cde176{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-2] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-2c9064c1-82f3-4ab4-bdd0-0fb09fa77c8c
Finished generating test XML results (0.165 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.196 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
:beam-runners-spark:validatesRunnerStreaming (Thread[Daemon worker,5,main]) completed. Took 10 mins 28.155 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 5s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/q5zum2na76mmy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2180

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2180/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

------------------------------------------
[...truncated 29.55 MB...]
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2824 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 825, ShuffleMapStage 829, ShuffleMapStage 836, ShuffleMapStage 819, ShuffleMapStage 823, ShuffleMapStage 838, ShuffleMapStage 827, ShuffleMapStage 821, ShuffleMapStage 831)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 836)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 834 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.6 KB, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:41997 (size: 54.6 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 143 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 834 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 834.0 with 4 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 834.0 (TID 642, localhost, executor driver, partition 0, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 834.0 (TID 643, localhost, executor driver, partition 1, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 834.0 (TID 644, localhost, executor driver, partition 2, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 834.0 (TID 645, localhost, executor driver, partition 3, PROCESS_LOCAL, 8165 bytes)
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 834.0 (TID 642)
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 834.0 (TID 643)
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 834.0 (TID 644)
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 834.0 (TID 645)
    [Executor task launch worker for task 645] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
    [Executor task launch worker for task 643] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
    [Executor task launch worker for task 642] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
    [Executor task launch worker for task 644] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 834.0 (TID 645). 59509 bytes result sent to driver
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 834.0 (TID 642). 59509 bytes result sent to driver
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 834.0 (TID 644). 59509 bytes result sent to driver
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 834.0 (TID 643). 59509 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 834.0 (TID 645) in 12 ms on localhost (executor driver) (1/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 834.0 (TID 644) in 13 ms on localhost (executor driver) (2/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 834.0 (TID 643) in 13 ms on localhost (executor driver) (3/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 834.0 (TID 642) in 13 ms on localhost (executor driver) (4/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 834.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 834 (mapToPair at GroupCombineFunctions.java:57) finished in 0.019 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 836, ShuffleMapStage 835, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 835 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144 stored as values in memory (estimated size 216.3 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:41997 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 144 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 835 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 835.0 with 5 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 835.0 (TID 646, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 835.0 (TID 647, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 835.0 (TID 648, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 835.0 (TID 649, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436 bytes)
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 835.0 (TID 646)
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 835.0 (TID 648)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 835.0 (TID 649)
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 835.0 (TID 647)
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 835.0 (TID 649). 59940 bytes result sent to driver
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 835.0 (TID 646). 59940 bytes result sent to driver
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 835.0 (TID 647). 59940 bytes result sent to driver
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 835.0 (TID 648). 59940 bytes result sent to driver
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 835.0 (TID 650, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 835.0 (TID 650)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 835.0 (TID 649) in 14 ms on localhost (executor driver) (1/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 835.0 (TID 647) in 15 ms on localhost (executor driver) (2/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 835.0 (TID 646) in 15 ms on localhost (executor driver) (3/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 835.0 (TID 648) in 15 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 835.0 (TID 650). 59467 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 835.0 (TID 650) in 13 ms on localhost (executor driver) (5/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 835.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 835 (mapToPair at GroupCombineFunctions.java:57) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 836, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 836 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145 stored as values in memory (estimated size 217.5 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:41997 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 836 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 836.0 with 5 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 836.0 (TID 651, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 836.0 (TID 652, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 836.0 (TID 653, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 836.0 (TID 654, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 836.0 (TID 651)
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 836.0 (TID 653)
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 836.0 (TID 654)
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 836.0 (TID 652)
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 836.0 (TID 654). 59896 bytes result sent to driver
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 836.0 (TID 652). 59896 bytes result sent to driver
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 836.0 (TID 653). 59896 bytes result sent to driver
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 836.0 (TID 651). 59853 bytes result sent to driver
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 836.0 (TID 655, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 836.0 (TID 655)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 836.0 (TID 654) in 14 ms on localhost (executor driver) (1/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 836.0 (TID 652) in 14 ms on localhost (executor driver) (2/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 836.0 (TID 651) in 14 ms on localhost (executor driver) (3/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 836.0 (TID 653) in 14 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 836.0 (TID 655). 59853 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 836.0 (TID 655) in 13 ms on localhost (executor driver) (5/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 836.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 836 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146 stored as values in memory (estimated size 188.2 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 58.1 KB, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:41997 (size: 58.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 839.0 with 4 tasks
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 839.0 (TID 656, localhost, executor driver, partition 0, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 839.0 (TID 657, localhost, executor driver, partition 1, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 839.0 (TID 658, localhost, executor driver, partition 2, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 839.0 (TID 659, localhost, executor driver, partition 3, PROCESS_LOCAL, 8132 bytes)
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 839.0 (TID 657)
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 839.0 (TID 659)
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 839.0 (TID 658)
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 839.0 (TID 656)
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:41997 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 839.0 (TID 659). 59881 bytes result sent to driver
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 839.0 (TID 657). 59881 bytes result sent to driver
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 839.0 (TID 658). 59881 bytes result sent to driver
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 839.0 (TID 656). 59881 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 839.0 (TID 657) in 13 ms on localhost (executor driver) (1/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 839.0 (TID 659) in 13 ms on localhost (executor driver) (2/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 839.0 (TID 656) in 14 ms on localhost (executor driver) (3/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 839.0 (TID 658) in 14 ms on localhost (executor driver) (4/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 839.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 839 (foreach at UnboundedDataset.java:79) finished in 0.021 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:79, took 0.119489 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542062965500 ms.3 from job set of time 1542062965500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.231 s for time 1542062965500 ms (execution: 0.479 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@8fb9762{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@afa695d{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@4be1012f{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@2cad9a3e{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-1] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 291 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-a4d92348-a8c7-4c28-a7b6-8e028dff2e39

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@61ffbeff{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 294 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-dc420450-c9ad-4be8-9c4d-40c67d12679e
Finished generating test XML results (0.148 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.135 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 4,5,main]) completed. Took 10 mins 27.148 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 23s
43 actionable tasks: 43 executed

Publishing build scan...
https://gradle.com/s/5n6ht4kubupwg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
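
When triaging a failure like the one above, the two actionable facts in the Gradle tail are the failing task name ("What went wrong") and the test-report URL. As a sketch only (a hypothetical helper, not part of Beam, Gradle, or Jenkins tooling), both can be scraped from a saved console log with a couple of regexes:

```python
import re

# Hypothetical helper: extract the failing Gradle task and the HTML test-report
# URL from a Jenkins console log shaped like the output above.
FAILED_TASK_RE = re.compile(r"Execution failed for task '(?P<task>[^']+)'")
REPORT_RE = re.compile(
    r"See the report at: \S*?(?P<url>https?://\S+?)>?$", re.MULTILINE
)

def summarize_failure(console_log: str):
    """Return (task, report_url); either element is None if not found."""
    task = FAILED_TASK_RE.search(console_log)
    report = REPORT_RE.search(console_log)
    return (
        task.group("task") if task else None,
        report.group("url") if report else None,
    )

sample = """\
* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/x/index.html>
"""
print(summarize_failure(sample))
```

This only inspects the Gradle summary block, so it is insensitive to the megabytes of Spark INFO lines that precede it.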

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2179

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2179/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

------------------------------------------
[...truncated 29.79 MB...]
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:79
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2824 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 832, ShuffleMapStage 818, ShuffleMapStage 836, ShuffleMapStage 822, ShuffleMapStage 816, ShuffleMapStage 834, ShuffleMapStage 820, ShuffleMapStage 838, ShuffleMapStage 824)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 832)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 829 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.6 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:44567 (size: 54.6 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 143 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 829 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 829.0 with 4 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 829.0 (TID 642, localhost, executor driver, partition 0, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 829.0 (TID 643, localhost, executor driver, partition 1, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 829.0 (TID 644, localhost, executor driver, partition 2, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 829.0 (TID 645, localhost, executor driver, partition 3, PROCESS_LOCAL, 8165 bytes)
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 829.0 (TID 643)
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 829.0 (TID 644)
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 829.0 (TID 642)
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 829.0 (TID 645)
    [Executor task launch worker for task 645] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
    [Executor task launch worker for task 642] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
    [Executor task launch worker for task 643] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
    [Executor task launch worker for task 644] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
    [Executor task launch worker for task 645] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 829.0 (TID 645). 59466 bytes result sent to driver
    [Executor task launch worker for task 644] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 829.0 (TID 644). 59466 bytes result sent to driver
    [Executor task launch worker for task 642] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 829.0 (TID 642). 59466 bytes result sent to driver
    [Executor task launch worker for task 643] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 829.0 (TID 643). 59466 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 829.0 (TID 645) in 12 ms on localhost (executor driver) (1/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 829.0 (TID 642) in 13 ms on localhost (executor driver) (2/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 829.0 (TID 644) in 12 ms on localhost (executor driver) (3/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 829.0 (TID 643) in 12 ms on localhost (executor driver) (4/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 829.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 829 (mapToPair at GroupCombineFunctions.java:57) finished in 0.020 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 832, ResultStage 839, ShuffleMapStage 831)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 831 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144 stored as values in memory (estimated size 216.3 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:44567 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 144 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 831 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 831.0 with 5 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 831.0 (TID 646, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 831.0 (TID 647, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 831.0 (TID 648, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 831.0 (TID 649, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436 bytes)
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 831.0 (TID 647)
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 831.0 (TID 648)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 831.0 (TID 649)
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 831.0 (TID 646)
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
    [Executor task launch worker for task 647] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 648] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 646] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 648] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 831.0 (TID 648). 59940 bytes result sent to driver
    [Executor task launch worker for task 647] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 831.0 (TID 647). 59940 bytes result sent to driver
    [Executor task launch worker for task 646] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 831.0 (TID 646). 59940 bytes result sent to driver
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 831.0 (TID 650, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 831.0 (TID 650)
    [Executor task launch worker for task 649] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 831.0 (TID 649). 59940 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 831.0 (TID 648) in 15 ms on localhost (executor driver) (1/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 831.0 (TID 647) in 15 ms on localhost (executor driver) (2/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 831.0 (TID 646) in 15 ms on localhost (executor driver) (3/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 831.0 (TID 649) in 15 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 650] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 831.0 (TID 650). 59467 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 831.0 (TID 650) in 13 ms on localhost (executor driver) (5/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 831.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 831 (mapToPair at GroupCombineFunctions.java:57) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ShuffleMapStage 832, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 832 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145 stored as values in memory (estimated size 217.5 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 64.1 KB, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:44567 (size: 64.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 832 (MapPartitionsRDD[2824] at mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 832.0 with 5 tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 832.0 (TID 651, localhost, executor driver, partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 832.0 (TID 652, localhost, executor driver, partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 832.0 (TID 653, localhost, executor driver, partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 832.0 (TID 654, localhost, executor driver, partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 832.0 (TID 651)
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 832.0 (TID 652)
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 832.0 (TID 653)
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 832.0 (TID 654)
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 654] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 651] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 652] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 653] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 651] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 832.0 (TID 651). 59896 bytes result sent to driver
    [Executor task launch worker for task 652] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 832.0 (TID 652). 59853 bytes result sent to driver
    [Executor task launch worker for task 654] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 832.0 (TID 654). 59896 bytes result sent to driver
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 832.0 (TID 655, localhost, executor driver, partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 832.0 (TID 655)
    [Executor task launch worker for task 653] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 832.0 (TID 653). 59853 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 832.0 (TID 651) in 14 ms on localhost (executor driver) (1/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 832.0 (TID 652) in 14 ms on localhost (executor driver) (2/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 832.0 (TID 654) in 14 ms on localhost (executor driver) (3/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 832.0 (TID 653) in 15 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 655] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 655] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 832.0 (TID 655). 59896 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 832.0 (TID 655) in 13 ms on localhost (executor driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 832.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 832 (mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146 stored as values in memory (estimated size 188.2 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 58.1 KB, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:44567 (size: 58.1 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 839 (MapPartitionsRDD[2844] at map at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 839.0 with 4 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 839.0 (TID 656, localhost, executor driver, partition 0, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 839.0 (TID 657, localhost, executor driver, partition 1, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 839.0 (TID 658, localhost, executor driver, partition 2, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 839.0 (TID 659, localhost, executor driver, partition 3, PROCESS_LOCAL, 8132 bytes)
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 839.0 (TID 657)
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 839.0 (TID 656)
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 839.0 (TID 658)
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 839.0 (TID 659)
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 1 ms
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
    [Executor task launch worker for task 656] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 657] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 658] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 659] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 656] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 839.0 (TID 656). 59881 bytes result sent to driver
    [Executor task launch worker for task 657] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 839.0 (TID 657). 59881 bytes result sent to driver
    [Executor task launch worker for task 658] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 839.0 (TID 658). 59881 bytes result sent to driver
    [Executor task launch worker for task 659] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 839.0 (TID 659). 59881 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 839.0 (TID 656) in 15 ms on localhost (executor driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 839.0 (TID 657) in 15 ms on localhost (executor driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 839.0 (TID 658) in 15 ms on localhost (executor driver) (3/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 839.0 (TID 659) in 15 ms on localhost (executor driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 839.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 839 (foreach at UnboundedDataset.java:79) finished in 0.023 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:79, took 0.121113 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1542061732000 ms.3 from job set of time 1542061732000 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.420 s for time 1542061732000 ms (execution: 0.565 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@c79524a{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@3f5303bf{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@2eb7f2ac{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@7179880a{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-0] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 291 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-16463eff-0a71-4aa1-8204-b689e79e3631

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@61ffbeff{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-9b90677a-c0f0-4d5b-9f57-360ebfc356f5
Finished generating test XML results (0.13 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.109 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 10 mins 26.931 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 2s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/s4xckma4gkvws

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org